mirror of https://github.com/searx/searx
Compare commits
70 Commits
Author | SHA1 | Date |
---|---|---|
Noémi Ványi | 276ffd3f01 | |
Grant Lanham Jr | 75b859d2a8 | |
dependabot[bot] | 48eb13cf4c | |
searx-bot | cfec62eb7c | |
dependabot[bot] | 7c70c02220 | |
Brett Kosinski | 2fc1cd3a15 | |
searx-bot | 5e658ef276 | |
Grant Lanham Jr | 7c6a926648 | |
br4nnigan | 38606234a8 | |
searx-bot | eb39a846f3 | |
searx-bot | b15dfe0ede | |
Dr. Rolf Jansen | 9ba072bb74 | |
wibyweb | de0fde4ec2 | |
Grant Lanham Jr | 67c233d0c3 | |
ganeshlab | 2ec47dce5e | |
Émilien Devos (perso) | 8e943d858f | |
Noémi Ványi | 6ab43d1045 | |
dependabot[bot] | c647b55eb0 | |
dependabot[bot] | 8fc9ca3f01 | |
dependabot[bot] | ffc8ce4a51 | |
searx-bot | f6bafab8c4 | |
searx-bot | 07240a8109 | |
dependabot[bot] | 2612204876 | |
dependabot[bot] | bee9ff29e6 | |
dependabot[bot] | cda72d29fc | |
Dr. Rolf Jansen | a79e8194d7 | |
dependabot[bot] | 52a21d1192 | |
dependabot[bot] | 5af8f4f563 | |
dependabot[bot] | b22cd73940 | |
dependabot[bot] | 03b3ad81ec | |
ebCrypto | d993da3a7f | |
ebCrypto | ee231637a2 | |
dependabot[bot] | 117dbd462f | |
searx-bot | 4d9586e2b6 | |
searx-bot | f05572e380 | |
searx-bot | 6f15b6b477 | |
searx-bot | 8e2761dcba | |
searx-bot | f365e1f683 | |
dependabot[bot] | 806bd8045e | |
Brett Kosinski | 3c84af95ba | |
searx-bot | a9a6c58d26 | |
dependabot[bot] | 915bc3ad58 | |
dependabot[bot] | 05d8bce379 | |
searx-bot | bf0a583f4b | |
searx-bot | a8810f4813 | |
searx-bot | c8c922cad4 | |
dependabot[bot] | 5d6fe4f332 | |
dependabot[bot] | 4e073bd708 | |
dependabot[bot] | 05977f3221 | |
Noémi Ványi | a1e2c501d2 | |
Kian-Meng Ang | 629ebb426f | |
Noémi Ványi | 57e7e3bbf6 | |
Noémi Ványi | 539e1a873e | |
Adam Tauber | 31eef5b9db | |
br4nnigan | a9dadda6f7 | |
Adam Tauber | 2222caec22 | |
Markus Heiser | 1abecbc835 | |
Émilien Devos | d656b340ee | |
Rob | 9e8995e13d | |
dependabot[bot] | d471c4a3f4 | |
Elena Poelman | ca27e91594 | |
dependabot[bot] | d86cb95560 | |
Noémi Ványi | 6d40961682 | |
dependabot[bot] | 7bc0f3cc89 | |
Noémi Ványi | d004439646 | |
dependabot[bot] | 319a24317e | |
dependabot[bot] | abc877d86e | |
dependabot[bot] | 245334c1ab | |
Morten Lautrup | f199100e40 | |
Noémi Ványi | c1a611c6b9 | |
@@ -424,7 +424,7 @@ Special thanks to `NLNet <https://nlnet.nl>`__ for sponsoring multiple features
 - Removed engines: faroo
 
 Special thanks to `NLNet <https://nlnet.nl>`__ for sponsoring multiple features of this release.
-Special thanks to https://www.accessibility.nl/english for making accessibilty audit.
+Special thanks to https://www.accessibility.nl/english for making accessibility audit.
 
 News
 ~~~~
@@ -16,7 +16,7 @@
 
 ## Author's checklist
 
-<!-- additional notes for reviewiers -->
+<!-- additional notes for reviewers -->
 
 ## Related issues
 
23
README.rst
@@ -1,5 +1,7 @@
 .. SPDX-License-Identifier: AGPL-3.0-or-later
 
+Searx is no longer maintained. Thank you for your support and all your contributions.
+
 .. figure:: https://raw.githubusercontent.com/searx/searx/master/searx/static/themes/oscar/img/logo_searx_a.png
    :target: https://searx.github.io/searx/
    :alt: searX
@@ -73,28 +75,21 @@ Frequently asked questions
 Is searx in maintenance mode?
 #############################
 
-No, searx is accepting new features, including new engines. We are also adding
-engine fixes or other bug fixes when needed. Also, keep in mind that searx is
-maintained by volunteers who work in their free time. So some changes might take
-some time to be merged.
-
-We reject features that might violate the privacy of users. If you really want
-such a feature, it must be disabled by default and warn users about the consequances
-of turning it off.
+No, searx is no longer maintained.
 
 What is the difference between searx and SearxNG?
 #################################################
 
-TL;DR: If you want to run a public instance, go with SearxNG. If you want to
-self host your own instance, choose searx.
+TL;DR: SearXNG is for users that want more features and bugs getting fixed quicker.
+If you prefer a minimalist software and stable experience, use searx.
 
 SearxNG is a fork of searx, created by a former maintainer of searx. The fork
 was created because the majority of the maintainers at the time did not find
 the new proposed features privacy respecting enough. The most significant issue is with
 engine metrics.
 
-Searx is built for privacy conscious users. It comes a unique set of
-challanges. One of the problems we face is that users rather not report bugs,
+Searx is built for privacy conscious users. It comes with a unique set of
+challenges. One of the problems we face is that users rather not report bugs,
 because they do not want to publicly share what engines they use or what search
 query triggered a problem. It is a challenge we accepted.
 
@@ -124,8 +119,8 @@ instances locally, instead of using public instances.
 Why should I use SearxNG?
 #########################
 
-SearxNG has rolling releases, depencencies updated more frequently, and engines are fixed
+SearxNG has rolling releases, dependencies updated more frequently, and engines are fixed
 faster. It is easy to set up your own public instance, and monitor its
-perfomance and metrics. It is simple to maintain as an instance adminstrator.
+performance and metrics. It is simple to maintain as an instance administrator.
 
 As a user, it provides a prettier user interface and nicer experience.
@@ -100,7 +100,7 @@ update_conf() {
     # There is a new version
     if [ $FORCE_CONF_UPDATE -ne 0 ]; then
         # Replace the current configuration
-        printf '⚠️ Automaticaly update %s to the new version\n' "${CONF}"
+        printf '⚠️ Automatically update %s to the new version\n' "${CONF}"
         if [ ! -f "${OLD_CONF}" ]; then
             printf 'The previous configuration is saved to %s\n' "${OLD_CONF}"
             mv "${CONF}" "${OLD_CONF}"
@@ -9,7 +9,7 @@ workers = 4
 # The right granted on the created socket
 chmod-socket = 666
 
-# Plugin to use and interpretor config
+# Plugin to use and interpreter config
 single-interpreter = true
 master = true
 plugin = python3
@@ -0,0 +1,129 @@
=====================================
Run shell commands from your instance
=====================================

Command line engines are custom engines that run commands in the shell of the
host. In this article you can learn how to create a command engine and how to
customize the result display.

The command
===========

When specifying commands, you must make sure the commands are available on the
searx host. Searx will not install anything for you. Also, make sure that the
``searx`` user on your host is allowed to run the selected command and has
access to the required files.

Access control
==============

Be careful when creating command engines if you are running a public
instance. Do not expose any sensitive information. You can restrict access by
configuring a list of access tokens under ``tokens`` in your ``settings.yml``.
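As a sketch of that restriction, a command engine can be locked down with a token list like the following; the engine name and token value are placeholders, and the ``tokens`` syntax is the one described in the private engines tutorial:

.. code:: yaml

   - name : my-private-find
     engine : command
     command : ['find', '.', '-name', '{{QUERY}}']
     query_type : path
     tokens : ['my-secret-token']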

Available settings
==================

* ``command``: A comma separated list of the elements of the command. A special
  token ``{{QUERY}}`` tells searx where to put the search terms of the
  user. Example: ``['ls', '-l', '-h', '{{QUERY}}']``
* ``query_type``: The expected type of user search terms. Possible values:
  ``path`` and ``enum``. ``path`` checks if the user provided path is inside the
  working directory. If not, the query is not executed. ``enum`` is a list of
  allowed search terms. If the user submits something which is not included in
  the list, the query returns an error.
* ``delimiter``: A dict containing a delimiter char and the "titles" of each
  element in keys.
* ``parse_regex``: A dict containing the regular expressions for each result
  key.
* ``query_enum``: A list containing allowed search terms if ``query_type`` is
  set to ``enum``.
* ``working_dir``: The directory where the command has to be executed. Default:
  ``.``
* ``result_separator``: The character that separates results. Default: ``\n``
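To illustrate how these settings combine, here is a hypothetical engine that parses the output of ``du`` with ``parse_regex`` instead of ``delimiter``; the engine name, the ``working_dir`` and the regular expressions are illustrative only:

.. code:: yaml

   - name : disk usage
     engine : command
     command : ['du', '-sh', '{{QUERY}}']
     query_type : path
     working_dir : /var/data
     parse_regex :
         size : '^[0-9.,]+[KMGT]?'
         path : '\S+$'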

Customize the result template
=============================

There is a default result template for displaying key-value pairs coming from
command engines. If you want something more tailored to your result types, you
can design your own template.

Searx relies on `Jinja2 <https://jinja.palletsprojects.com/>`_ for
templating. If you are familiar with Jinja, you will not have any issues
creating templates. You can access the result attributes with ``{{
result.attribute_name }}``.

In the example below the result has two attributes: ``header`` and ``content``.
To customize their display, you need the following template (you must define
these classes yourself):

.. code:: html

   <div class="result">
       <div class="result-header">
           {{ result.header }}
       </div>
       <div class="result-content">
           {{ result.content }}
       </div>
   </div>

Then put your template under ``searx/templates/{theme-name}/result_templates``
named ``your-template-name.html``. You can select your custom template with the
option ``result_template``.

.. code:: yaml

   - name: your engine name
     engine: command
     result_template: your-template-name.html

Examples
========

Find files by name
------------------

The first example is to find files on your searx host. It uses the command
``find`` available on most Linux distributions. It expects a path type query. The
path in the search request must be inside the ``working_dir``.

The results are displayed with the default ``key-value.html`` template. A result
is displayed in a single row table with the key "line".

.. code:: yaml

   - name : find
     engine : command
     command : ['find', '.', '-name', '{{QUERY}}']
     query_type : path
     shortcut : fnd
     tokens : []
     disabled : True
     delimiter :
         chars : ' '
         keys : ['line']

Find files by contents
----------------------

In the second example, we define an engine that searches in the contents of the
files under the ``working_dir``. The search type is not defined, so the user can
input any string they want. To restrict the input, you can set the ``query_type``
to ``enum`` and only allow a set of search terms to protect
yourself. Alternatively, make the engine private, so no one malevolent accesses
the engine.

.. code:: yaml

   - name : regex search in files
     engine : command
     command : ['grep', '{{QUERY}}']
     shortcut : gr
     tokens : []
     disabled : True
     delimiter :
         chars : ' '
         keys : ['line']
@@ -37,7 +37,7 @@ Disabled **D** Engine type **ET**
 ------------- ----------- -------------------- ------------
 Safe search **SS**
 ------------- ----------- ---------------------------------
-Weigth **W**
+Weight **W**
 ------------- ----------- ---------------------------------
 Disabled **D**
 ------------- ----------- ---------------------------------
@@ -86,3 +86,60 @@ Show errors **DE**
 {% endfor %}
 
+.. flat-table:: Additional engines (commented out in settings.yml)
+   :header-rows: 1
+   :stub-columns: 2
+
+   * - Name
+     - Base URL
+     - Host
+     - Port
+     - Paging
+
+   * - elasticsearch
+     - localhost:9200
+     -
+     -
+     - False
+
+   * - meilisearch
+     - localhost:7700
+     -
+     -
+     - True
+
+   * - mongodb
+     -
+     - 127.0.0.1
+     - 27017
+     - True
+
+   * - mysql_server
+     -
+     - 127.0.0.1
+     - 3306
+     - True
+
+   * - postgresql
+     -
+     - 127.0.0.1
+     - 5432
+     - True
+
+   * - redis_server
+     -
+     - 127.0.0.1
+     - 6379
+     - False
+
+   * - solr
+     - localhost:8983
+     -
+     -
+     - True
+
+   * - sqlite
+     -
+     -
+     -
+     - True
@@ -39,7 +39,7 @@ Example
 Scenario:
 
 #. Recoll indexes a local filesystem mounted in ``/export/documents/reference``,
-#. the Recoll search inteface can be reached at https://recoll.example.org/ and
+#. the Recoll search interface can be reached at https://recoll.example.org/ and
 #. the contents of this filesystem can be reached through https://download.example.org/reference
 
 .. code:: yaml
@@ -19,5 +19,9 @@ Administrator documentation
    filtron
    morty
    engines
+   private-engines
+   command-engine
+   indexer-engines
+   no-sql-engines
    plugins
    buildhosts
@@ -0,0 +1,89 @@
==================
Search in indexers
==================

Searx supports three popular indexer search engines:

* Elasticsearch
* Meilisearch
* Solr

Elasticsearch
=============

Make sure that the Elasticsearch user has access to the index you are querying.
If you are not using TLS during your connection, set ``enable_http`` to ``True``.

.. code:: yaml

   - name : elasticsearch
     shortcut : es
     engine : elasticsearch
     base_url : http://localhost:9200
     username : elastic
     password : changeme
     index : my-index
     query_type : match
     enable_http : True

Available settings
------------------

* ``base_url``: URL of the Elasticsearch instance. By default it is set to ``http://localhost:9200``.
* ``index``: Name of the index to query. Required.
* ``query_type``: Elasticsearch query method to use. Available: ``match``,
  ``simple_query_string``, ``term``, ``terms``, ``custom``.
* ``custom_query_json``: If you selected ``custom`` for ``query_type``, you must
  provide the JSON payload in this option.
* ``username``: Username in Elasticsearch
* ``password``: Password for the Elasticsearch user
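For the ``custom`` query type, the configuration might look like the following sketch; the JSON payload and its exact quoting are assumptions, so consult the Elasticsearch query DSL for the body you actually need:

.. code:: yaml

   - name : elasticsearch-tags
     engine : elasticsearch
     base_url : http://localhost:9200
     index : my-index
     query_type : custom
     custom_query_json : '{"query": {"term": {"tags": "{{QUERY}}"}}}'
     enable_http : True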

Meilisearch
===========

If you are not using TLS during connection, set ``enable_http`` to ``True``.

.. code:: yaml

   - name : meilisearch
     engine : meilisearch
     shortcut : mes
     base_url : http://localhost:7700
     index : my-index
     enable_http : True

Available settings
------------------

* ``base_url``: URL of the Meilisearch instance. By default it is set to ``http://localhost:7700``.
* ``index``: Name of the index to query. Required.
* ``auth_key``: Key required for authentication.
* ``facet_filters``: List of facets to search in.
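For an instance that requires authentication, ``auth_key`` can be added to the basic Meilisearch configuration (the key value here is a placeholder):

.. code:: yaml

   - name : meilisearch
     engine : meilisearch
     shortcut : mes
     base_url : http://localhost:7700
     index : my-index
     auth_key : my-master-key
     enable_http : True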

Solr
====

If you are not using TLS during connection, set ``enable_http`` to ``True``.

.. code:: yaml

   - name : solr
     engine : solr
     shortcut : slr
     base_url : http://localhost:8983
     collection : my-collection
     sort : asc
     enable_http : True

Available settings
------------------

* ``base_url``: URL of the Solr instance. By default it is set to ``http://localhost:8983``.
* ``collection``: Name of the collection to query. Required.
* ``sort``: Sorting of the results. Available: ``asc``, ``desc``.
* ``rows``: Maximum number of results from a query. Default value: 10.
* ``field_list``: List of fields returned from the query.
* ``default_fields``: Default fields to query.
* ``query_fields``: List of fields with a boost factor. The bigger the boost
  factor of a field, the more important the field is in the query. Example:
  ``qf="field1^2.3 field2"``
@@ -94,8 +94,8 @@ My experience is, that this command is a bit buggy.
 
 .. _uwsgi configuration:
 
-Alltogether
-===========
+All together
+============
 
 Create the configuration ini-file according to your distribution (see below) and
 restart the uwsgi application.
@@ -0,0 +1,170 @@
===========================
Query SQL and NoSQL servers
===========================

SQL
===

SQL servers are traditional databases with a predefined data schema. Furthermore,
modern versions also support BLOB data.

You can search in the following servers:

* `PostgreSQL`_
* `MySQL`_
* `SQLite`_

The configuration of the new database engines is similar. You must put a valid
SELECT SQL query in ``query_str``. At the moment you can only bind at most
one parameter in your query.

Do not include LIMIT or OFFSET in your SQL query as the engines
rely on these keywords during paging.

PostgreSQL
----------

Required PyPI package: ``psycopg2``

You can find an example configuration below:

.. code:: yaml

   - name : postgresql
     engine : postgresql
     database : my_database
     username : searx
     password : password
     query_str : 'SELECT * from my_table WHERE my_column = %(query)s'
     shortcut : psql

Available options
~~~~~~~~~~~~~~~~~

* ``host``: IP address of the host running PostgreSQL. By default it is ``127.0.0.1``.
* ``port``: Port number PostgreSQL is listening on. By default it is ``5432``.
* ``database``: Name of the database you are connecting to.
* ``username``: Name of the user connecting to the database.
* ``password``: Password of the database user.
* ``query_str``: Query string to run. Keywords like ``LIMIT`` and ``OFFSET`` are not allowed. Required.
* ``limit``: Number of returned results per page. By default it is 10.

MySQL
-----

Required PyPI package: ``mysql-connector-python``

This is an example configuration for querying a MySQL server:

.. code:: yaml

   - name : mysql
     engine : mysql_server
     database : my_database
     username : searx
     password : password
     limit : 5
     query_str : 'SELECT * from my_table WHERE my_column=%(query)s'
     shortcut : mysql

Available options
~~~~~~~~~~~~~~~~~

* ``host``: IP address of the host running MySQL. By default it is ``127.0.0.1``.
* ``port``: Port number MySQL is listening on. By default it is ``3306``.
* ``database``: Name of the database you are connecting to.
* ``auth_plugin``: Authentication plugin to use. By default it is ``caching_sha2_password``.
* ``username``: Name of the user connecting to the database.
* ``password``: Password of the database user.
* ``query_str``: Query string to run. Keywords like ``LIMIT`` and ``OFFSET`` are not allowed. Required.
* ``limit``: Number of returned results per page. By default it is 10.

SQLite
------

You can read from your database ``my_database`` using this example configuration:

.. code:: yaml

   - name : sqlite
     engine : sqlite
     shortcut : sq
     database : my_database
     query_str : 'SELECT * FROM my_table WHERE my_column=:query'

Available options
~~~~~~~~~~~~~~~~~

* ``database``: Name of the database you are connecting to.
* ``query_str``: Query string to run. Keywords like ``LIMIT`` and ``OFFSET`` are not allowed. Required.
* ``limit``: Number of returned results per page. By default it is 10.

NoSQL
=====

NoSQL data stores are used for storing arbitrary data without first defining their
structure. To query the supported servers, you must install their drivers from PyPI.

You can search in the following servers:

* `Redis`_
* `MongoDB`_

Redis
-----

Required PyPI package: ``redis``

Example configuration:

.. code:: yaml

   - name : mystore
     engine : redis_server
     exact_match_only : True
     host : 127.0.0.1
     port : 6379
     password : secret-password
     db : 0
     shortcut : rds
     enable_http : True

Available options
~~~~~~~~~~~~~~~~~

* ``host``: IP address of the host running Redis. By default it is ``127.0.0.1``.
* ``port``: Port number Redis is listening on. By default it is ``6379``.
* ``password``: Password if required by Redis.
* ``db``: Number of the database you are connecting to.
* ``exact_match_only``: Enable if you need exact matching. By default it is ``True``.

MongoDB
-------

Required PyPI package: ``pymongo``

Below is an example configuration for using a MongoDB collection:

.. code:: yaml

   - name : mymongo
     engine : mongodb
     shortcut : icm
     host : '127.0.0.1'
     port : 27017
     database : personal
     collection : income
     key : month
     enable_http : True

Available options
~~~~~~~~~~~~~~~~~

* ``host``: IP address of the host running MongoDB. By default it is ``127.0.0.1``.
* ``port``: Port number MongoDB is listening on. By default it is ``27017``.
* ``password``: Password if required by MongoDB.
* ``database``: Name of the database you are connecting to.
* ``collection``: Name of the collection you want to search in.
* ``exact_match_only``: Enable if you need exact matching. By default it is ``True``.
Binary image file not shown (74 KiB).
@@ -0,0 +1,44 @@
=============================
How to create private engines
=============================

If you are running a public searx instance, you might want to restrict access
to some engines. Maybe you are afraid that bots might abuse the engine. Or the
engine might return private results you do not want to share with strangers.

Server side configuration
=========================

You can make any engine private by setting a list of tokens in your settings.yml
file. In the following example, we set two different tokens that provide access
to the engine.

.. code:: yaml

   - name: my-private-google
     engine: google
     shortcut: pgo
     tokens: ['my-secret-token-1', 'my-secret-token-2']

To access the private engine, you must distribute the tokens to your searx
users. It is up to you how you let them know what the access token is you
created.

Client side configuration
=========================

As a searx instance user, you can add any number of access tokens on the
Preferences page. You have to set a comma separated list of strings in the "Engine
tokens" input, then save your new preferences.

.. image:: prefernces-private.png
   :width: 600px
   :align: center
   :alt: location of token textarea

Once the Preferences page is loaded again, you can see the information of the
private engines you got access to. If you cannot see the expected engines in the
engines list, double check your token. If there is no issue with the token,
contact your instance administrator.
@@ -129,7 +129,7 @@ Global Settings
 outgoing: # communication with search engines
     request_timeout : 2.0 # default timeout in seconds, can be override by engine
     # max_request_timeout: 10.0 # the maximum timeout in seconds
-    useragent_suffix : "" # informations like an email address to the administrator
+    useragent_suffix : "" # information like an email address to the administrator
     pool_connections : 100 # Number of different hosts
     pool_maxsize : 10 # Number of simultaneous requests by host
     # uncomment below section if you want to use a proxy
@@ -0,0 +1,48 @@
=================================
Private searx project is finished
=================================

We are officially finished with the Private searx project. The goal was to
extend searx capabilities beyond just searching on the Internet. We added
support for offline engines. These engines do not connect to the Internet;
they find results locally.

As some of the offline engines run commands on the searx host, we added an
option to protect any engine by making them private. Private engines can only be
accessed using a token.

After searx was prepared to run offline queries, we added numerous new engines:

1. Command line engine
2. MySQL
3. PostgreSQL
4. SQLite
5. Redis
6. MongoDB

We also added new engines that communicate over HTTP, but you might want to keep
them private:

1. Elasticsearch
2. Meilisearch
3. Solr

The last step was to document this work. We added new tutorials on creating
command engines, making engines private and also adding a custom result template
to your own engines.

Acknowledgement
===============

The project was sponsored by the `Search and Discovery Fund`_ of the `NLnet
Foundation`_. We would like to thank NLnet for not only the funds, but also the
conversations and their ideas. They were truly invested and passionate about
supporting searx.

.. _Search and Discovery Fund: https://nlnet.nl/discovery
.. _NLnet Foundation: https://nlnet.nl/

| Happy hacking.
| kvch // 2022.09.30 23:15
@@ -15,3 +15,4 @@ Blog
    search-indexer-engines
    sql-engines
    search-database-engines
+   documentation-offline-engines
@@ -207,7 +207,7 @@ debug services from filtron and morty analogous use:
 Another point we have to notice is that each service (:ref:`searx <searx.sh>`,
 :ref:`filtron <filtron.sh>` and :ref:`morty <morty.sh>`) runs under dedicated
 system user account with the same name (compare :ref:`create searx user`). To
-get a shell from theses accounts, simply call one of the scripts:
+get a shell from these accounts, simply call one of the scripts:

 .. tabs::
@@ -311,7 +311,7 @@ of the container:

 Now we can develop as usual in the working tree of our desktop system. Every
 time the software was changed, you have to restart the searx service (in the
-conatiner):
+container):

 .. tabs::
@@ -370,7 +370,7 @@ We build up a fully functional searx suite in a archlinux container:
 $ sudo -H ./utils/lxc.sh install suite searx-archlinux

 To access HTTP from the desktop we installed nginx for the services inside the
-conatiner:
+container:

 .. tabs::
@@ -16,7 +16,7 @@ you can use your owm template by placing the template under
 ``searx/templates/{theme_name}/result_templates/{template_name}`` and setting
 ``result_template`` attribute to ``{template_name}``.

-Futhermore, if you do not want to expose these engines on a public instance, you can
+Furthermore, if you do not want to expose these engines on a public instance, you can
 still add them and limit the access by setting ``tokens`` as described in the `blog post about
 private engines`_.
@@ -29,7 +29,7 @@ structure.
 Redis
 -----

-Reqired package: ``redis``
+Required package: ``redis``

 Redis is a key value based data store usually stored in memory.
@@ -15,7 +15,7 @@ All of the engines above are added to ``settings.yml`` just commented out, as yo
 Please note that if you are not using HTTPS to access these engines, you have to enable
 HTTP requests by setting ``enable_http`` to ``True``.

-Futhermore, if you do not want to expose these engines on a public instance, you can
+Furthermore, if you do not want to expose these engines on a public instance, you can
 still add them and limit the access by setting ``tokens`` as described in the `blog post about
 private engines`_.
@@ -57,7 +57,7 @@ small-scale (less than 10 million documents) data collections. E.g. it is great
 web pages you have visited and searching in the contents later.

 The engine supports faceted search, so you can search in a subset of documents of the collection.
-Futhermore, you can search in Meilisearch instances that require authentication by setting ``auth_token``.
+Furthermore, you can search in Meilisearch instances that require authentication by setting ``auth_token``.

 Here is a simple example to query a Meilisearch instance:
@@ -62,7 +62,7 @@ Before enabling MySQL engine, you must install the package ``mysql-connector-pyt
 The authentication plugin is configurable by setting ``auth_plugin`` in the attributes.
 By default it is set to ``caching_sha2_password``.

-This is an example configuration for quering a MySQL server:
+This is an example configuration for querying a MySQL server:

 .. code:: yaml
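The yaml example itself is not shown in this hunk; a hedged sketch of such a configuration (the database name, credentials and query are assumptions for illustration, not the documented values) might look like:

```yaml
- name: my database
  engine: mysql_server
  database: my_database
  username: searx
  password: password
  limit: 10
  query_str: 'SELECT * FROM my_table WHERE my_column = %(query)s'
  shortcut: mysql
```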
@@ -10,7 +10,7 @@ from searx.version import VERSION_STRING
 # Project --------------------------------------------------------------

 project = u'searx'
-copyright = u'2015-2021, Adam Tauber, Noémi Ványi'
+copyright = u'2015-2022, Adam Tauber, Noémi Ványi'
 author = u'Adam Tauber'
 release, version = VERSION_STRING, VERSION_STRING
 highlight_language = 'none'
@@ -101,13 +101,11 @@ imgmath_font_size = 14

 html_theme_options = {"index_sidebar_logo": True}
 html_context = {"project_links": [] }
-html_context["project_links"].append(ProjectLink("Blog", "blog/index.html"))
+html_context["project_links"].append(ProjectLink("Blog", brand.DOCS_URL + "/blog/index.html"))
 if brand.GIT_URL:
     html_context["project_links"].append(ProjectLink("Source", brand.GIT_URL))
 if brand.WIKI_URL:
     html_context["project_links"].append(ProjectLink("Wiki", brand.WIKI_URL))
-if brand.PUBLIC_INSTANCES:
-    html_context["project_links"].append(ProjectLink("Public instances", brand.PUBLIC_INSTANCES))
 if brand.TWITTER_URL:
     html_context["project_links"].append(ProjectLink("Twitter", brand.TWITTER_URL))
 if brand.ISSUE_URL:
@@ -41,7 +41,7 @@ engine file
 argument                type        information
 ======================= =========== ========================================================
 categories              list        pages, in which the engine is working
-paging                  boolean     support multible pages
+paging                  boolean     support multiple pages
 time_range_support      boolean     support search time range
 engine_type             str         ``online`` by default, other possible values are
                                     ``offline``, ``online_dictionary``, ``online_currency``
@@ -159,7 +159,7 @@ parsed arguments
 ----------------

 The function ``def request(query, params):`` always returns the ``params``
-variable. Inside searx, the following paramters can be used to specify a search
+variable. Inside searx, the following parameters can be used to specify a search
 request:

 =================== =========== ==========================================================================
@@ -15,7 +15,7 @@ generated and deployed at :docs:`github.io <.>`. For build prerequisites read
 :ref:`docs build`.

 The source files of Searx's documentation are located at :origin:`docs`. Sphinx
-assumes source files to be encoded in UTF-8 by defaul. Run :ref:`make docs.live
+assumes source files to be encoded in UTF-8 by default. Run :ref:`make docs.live
 <make docs.live>` to build HTML while editing.

 .. sidebar:: Further reading
@@ -227,13 +227,13 @@ To refer anchors use the `ref role`_ markup:

 .. code:: reST

-   Visit chapter :ref:`reST anchor`. Or set hyperlink text manualy :ref:`foo
+   Visit chapter :ref:`reST anchor`. Or set hyperlink text manually :ref:`foo
    bar <reST anchor>`.

 .. admonition:: ``:ref:`` role
    :class: rst-example

-   Visist chapter :ref:`reST anchor`. Or set hyperlink text manualy :ref:`foo
+   Visit chapter :ref:`reST anchor`. Or set hyperlink text manually :ref:`foo
    bar <reST anchor>`.

 .. _reST ordinary ref:
@@ -494,8 +494,8 @@ Figures & Images
 is flexible. To get best results in the generated output format, install
 ImageMagick_ and Graphviz_.

-Searx's sphinx setup includes: :ref:`linuxdoc:kfigure`. Scaleable here means;
-scaleable in sense of the build process. Normally in absence of a converter
+Searx's sphinx setup includes: :ref:`linuxdoc:kfigure`. Scalable here means;
+scalable in sense of the build process. Normally in absence of a converter
 tool, the build process will break. From the authors POV it’s annoying to care
 about the build process when handling with images, especially since he has no
 access to the build process. With :ref:`linuxdoc:kfigure` the build process
@@ -503,7 +503,7 @@ continues and scales output quality in dependence of installed image processors.

 If you want to add an image, you should use the ``kernel-figure`` (inheritance
 of :dudir:`figure`) and ``kernel-image`` (inheritance of :dudir:`image`)
-directives. E.g. to insert a figure with a scaleable image format use SVG
+directives. E.g. to insert a figure with a scalable image format use SVG
 (:ref:`svg image example`):

 .. code:: reST
@@ -1185,7 +1185,7 @@ and *targets* (e.g. a ref to :ref:`row 2 of table's body <row body 2>`).
       - cell 4.4

    * - row 5
-     - cell 5.1 with automatic span to rigth end
+     - cell 5.1 with automatic span to right end

    * - row 6
      - cell 6.1
@@ -1237,7 +1237,7 @@ and *targets* (e.g. a ref to :ref:`row 2 of table's body <row body 2>`).
       - cell 4.4

    * - row 5
-     - cell 5.1 with automatic span to rigth end
+     - cell 5.1 with automatic span to right end

    * - row 6
      - cell 6.1
@@ -8,9 +8,6 @@ Searx is a free internet metasearch engine which aggregates results from more
 than 70 search services. Users are neither tracked nor profiled. Additionally,
 searx can be used over Tor for online anonymity.

-Get started with searx by using one of the Searx-instances_. If you don't trust
-anyone, you can set up your own, see :ref:`installation`.
-
 .. sidebar:: Features

 - Self hosted
@@ -33,5 +30,3 @@ anyone, you can set up your own, see :ref:`installation`.
    searx_extra/index
    utils/index
    blog/index
-
-.. _Searx-instances: https://searx.space
@@ -17,7 +17,7 @@ Prefix: ``:``
 Prefix: ``?``
 to add engines and categories to the currently selected categories

-Abbrevations of the engines and languages are also accepted. Engine/category
+Abbreviations of the engines and languages are also accepted. Engine/category
 modifiers are chainable and inclusive (e.g. with :search:`!it !ddg !wp qwer
 <?q=%21it%20%21ddg%20%21wp%20qwer>` search in IT category **and** duckduckgo
 **and** wikipedia for ``qwer``).
manage
@@ -188,13 +188,11 @@ docker.build() {
         die 1 "there is no remote origin"
     fi

-    # This is a git repository
-
     # "git describe" to get the Docker version (for example : v0.15.0-89-g0585788e)
     # awk to remove the "v" and the "g"
-    SEARX_GIT_VERSION=$(git describe --match "v[0-9]*\.[0-9]*\.[0-9]*" HEAD 2>/dev/null | awk -F'-' '{OFS="-"; $1=substr($1, 2); if ($3) { $3=substr($3, 2); } print}')
+    SEARX_GIT_VERSION=$(git describe --tags | awk -F'-' '{OFS="-"; $1=substr($1, 2); if ($3) { $3=substr($3, 2); } print}')

     # add the suffix "-dirty" if the repository has uncommitted change
     # /!\ HACK for searx/searx: ignore utils/brand.env
     git update-index -q --refresh
     if [ ! -z "$(git diff-index --name-only HEAD -- | grep -v 'utils/brand.env')" ]; then
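To see what the awk program in ``docker.build()`` does, here is a minimal sketch using the example value from the comment above (``v0.15.0-89-g0585788e``):

```shell
# Example "git describe" output, taken from the comment in docker.build().
git_describe="v0.15.0-89-g0585788e"

# Same awk program as in the script: drop the leading "v" from the tag and
# the "g" prefix from the abbreviated commit hash (3rd dash-separated field).
SEARX_GIT_VERSION=$(echo "$git_describe" | awk -F'-' '{OFS="-"; $1=substr($1, 2); if ($3) { $3=substr($3, 2); } print}')
echo "$SEARX_GIT_VERSION"   # 0.15.0-89-0585788e
```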
@@ -286,9 +284,6 @@ node.env() {
     which npm &> /dev/null || die 1 'node.env - npm is not found!'

     ( set -e
-        # shellcheck disable=SC2030
-        PATH="$(npm bin):$PATH"
-        export PATH

         build_msg INSTALL "npm install $NPM_PACKAGES"
         # shellcheck disable=SC2086
@@ -1,19 +1,19 @@
-mock==4.0.3
+mock==5.0.1
 nose2[coverage_plugin]==0.12.0
 cov-core==1.15.0
-pycodestyle==2.9.1
+pycodestyle==2.10.0
-pylint==2.14.5
+pylint==2.15.9
-splinter==0.18.1
+splinter==0.19.0
 transifex-client==0.14.3; python_version < '3.10'
 transifex-client==0.12.5; python_version == '3.10'
-selenium==4.3.0
+selenium==4.8.3
-twine==4.0.1
+twine==4.0.2
-Pallets-Sphinx-Themes==2.0.2
+Pallets-Sphinx-Themes==2.0.3
 docutils==0.18
-Sphinx==5.1.1
+Sphinx==5.3.0
 sphinx-issues==3.0.1
 sphinx-jinja==2.0.2
 sphinx-tabs==3.4.1
 sphinxcontrib-programoutput==0.17
 sphinx-autobuild==2021.3.14
-linuxdoc==20211220
+linuxdoc==20221127
@@ -1,13 +1,13 @@
 Brotli==1.0.9
-babel==2.10.3
+babel==2.11.0
-certifi==2022.6.15
+certifi==2022.12.7
 flask-babel==2.0.0
-flask==2.2.1
+flask==2.2.2
 jinja2==3.1.2
 langdetect==1.0.9
-lxml==4.9.1
+lxml==4.9.2
 pygments==2.12.0
 python-dateutil==2.8.2
 pyyaml==6.0
-requests[socks]==2.28.1
+requests[socks]==2.28.2
-setproctitle==1.3.1
+setproctitle==1.3.2
@@ -41,11 +41,11 @@ if settings['ui']['static_path']:

 '''
 enable debug if
-    the environnement variable SEARX_DEBUG is 1 or true
+    the environment variable SEARX_DEBUG is 1 or true
     (whatever the value in settings.yml)
     or general.debug=True in settings.yml
 disable debug if
-    the environnement variable SEARX_DEBUG is 0 or false
+    the environment variable SEARX_DEBUG is 0 or false
     (whatever the value in settings.yml)
     or general.debug=False in settings.yml
 '''
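The precedence described by this docstring can be sketched as follows (the function name and signature are illustrative, not searx's actual code):

```python
import os

def resolve_debug(settings_debug: bool) -> bool:
    """Return the effective debug flag per the docstring above."""
    env = os.environ.get('SEARX_DEBUG', '').lower()
    if env in ('1', 'true'):
        return True   # environment variable wins over settings.yml
    if env in ('0', 'false'):
        return False  # environment variable wins over settings.yml
    return settings_debug  # fall back to general.debug in settings.yml
```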
File diff suppressed because it is too large
@@ -1,8 +1,9 @@
 {
     "versions": [
-        "102.0",
-        "101.0.1",
-        "101.0"
+        "111.0.1",
+        "111.0",
+        "110.0.1",
+        "110.0"
     ],
     "os": [
         "Windows NT 10.0; WOW64",
@@ -153,7 +153,6 @@
     "Q107164998": "cd mm²/m²",
     "Q107210119": "g/s",
     "Q107210344": "mg/s",
-    "Q107213614": "kJ/100g",
     "Q107226391": "cm⁻¹",
     "Q1072404": "K",
     "Q107244316": "mm⁻¹",
@@ -208,16 +207,38 @@
     "Q1091257": "tex",
     "Q1092296": "a",
     "Q110143852": "Ω cm",
+    "Q110143896": "cm³/g",
     "Q1104069": "$",
     "Q11061003": "μm²",
     "Q11061005": "nm²",
     "Q110742003": "dppx",
     "Q1131660": "st",
     "Q1137675": "cr",
+    "Q114002440": "𒄀",
+    "Q114002534": "𒃻",
+    "Q114002639": "𒈨𒊑",
+    "Q114002796": "𒂆",
+    "Q114002930": "𒀺",
+    "Q114002955": "𒀹𒃷",
+    "Q114002974": "𒃷",
     "Q1140444": "Zb",
     "Q1140577": "Yb",
+    "Q114589269": "A",
     "Q1152074": "Pb",
     "Q1152323": "Tb",
+    "Q115277430": "QB",
+    "Q115280832": "RB",
+    "Q115359862": "qg",
+    "Q115359863": "rg",
+    "Q115359865": "Rg",
+    "Q115359866": "Qg",
+    "Q115359910": "Rm",
+    "Q115533751": "rm",
+    "Q115533764": "qm",
+    "Q115533776": "Qm",
+    "Q116432446": "ᵐ",
+    "Q116432563": "ˢ",
+    "Q116443090": "ʰ",
     "Q1165799": "mil",
     "Q11776930": "Mg",
     "Q11830636": "psf",
@@ -236,12 +257,14 @@
     "Q12257695": "Eb/s",
     "Q12257696": "EB/s",
     "Q12261466": "kB/s",
+    "Q12263659": "mgal",
     "Q12265780": "Pb/s",
     "Q12265783": "PB/s",
     "Q12269121": "Yb/s",
     "Q12269122": "YB/s",
     "Q12269308": "Zb/s",
     "Q12269309": "ZB/s",
+    "Q1238720": "vols.",
     "Q1247300": "cm H₂O",
     "Q12714022": "cwt",
     "Q12789864": "GeV",
@@ -282,7 +305,6 @@
     "Q14914907": "th",
     "Q14916719": "Gpc",
     "Q14923662": "Pm³",
-    "Q1511773": "LSd",
     "Q15120301": "l atm",
     "Q1542309": "xu",
     "Q1545979": "ft³",
@@ -304,7 +326,6 @@
     "Q17255465": "v_P",
     "Q173117": "R$",
     "Q1741429": "kpm",
-    "Q174467": "Lm",
     "Q174728": "cm",
     "Q174789": "mm",
     "Q175821": "μm",
@@ -328,13 +349,11 @@
     "Q182429": "m/s",
     "Q1826195": "dl",
     "Q18413919": "cm/s",
-    "Q184172": "F",
     "Q185078": "a",
     "Q185153": "erg",
     "Q185648": "Torr",
     "Q185759": "span",
     "Q1872619": "zs",
-    "Q189097": "₧",
     "Q190095": "Gy",
     "Q19017495": "mm²",
     "Q190951": "S$",
@@ -350,6 +369,7 @@
     "Q194339": "B$",
     "Q1970718": "mam",
     "Q1972579": "pdl",
+    "Q19877834": "cd-ft",
     "Q199462": "LE",
     "Q199471": "Afs",
     "Q200323": "dm",
@@ -388,7 +408,7 @@
     "Q211256": "mi/h",
     "Q21154419": "PD",
     "Q211580": "BTU (th)",
-    "Q212120": "A h",
+    "Q212120": "A⋅h",
     "Q213005": "G$",
     "Q2140397": "in³",
     "Q214377": "ell",
@@ -428,7 +448,6 @@
     "Q23931040": "dam²",
     "Q23931103": "nmi²",
     "Q240468": "syr£",
-    "Q2414435": "$b.",
     "Q242988": "Lib$",
     "Q2438073": "ag",
     "Q2448803": "mV",
@@ -546,7 +565,7 @@
     "Q3773454": "Mpc",
     "Q3815076": "Kib",
     "Q3833309": "£",
-    "Q3858002": "mA h",
+    "Q3858002": "mA⋅h",
     "Q3867152": "ft/s²",
     "Q389062": "Tib",
     "Q3902688": "pl",
@@ -607,6 +626,8 @@
     "Q53393868": "GJ",
     "Q53393886": "PJ",
     "Q53393890": "EJ",
+    "Q53393893": "ZJ",
+    "Q53393898": "YJ",
     "Q53448786": "yHz",
     "Q53448790": "zHz",
     "Q53448794": "fHz",
@@ -620,6 +641,7 @@
     "Q53448826": "hHz",
     "Q53448828": "yJ",
     "Q53448832": "zJ",
+    "Q53448835": "fJ",
     "Q53448842": "pJ",
     "Q53448844": "nJ",
     "Q53448847": "μJ",
@@ -682,6 +704,7 @@
     "Q53951982": "Mt",
     "Q53952048": "kt",
     "Q54006645": "ZWb",
+    "Q54081354": "ZT",
     "Q54081925": "ZSv",
     "Q54082468": "ZS",
     "Q54083144": "ZΩ",
@@ -706,8 +729,6 @@
     "Q56157046": "nmol",
     "Q56157048": "pmol",
     "Q56160603": "fmol",
-    "Q56302633": "UM",
-    "Q56317116": "mgal",
     "Q56317622": "Q_P",
     "Q56318907": "kbar",
     "Q56349362": "Bs.S",
@@ -1184,10 +1205,10 @@
     "Q11570": "kg",
     "Q11573": "m",
     "Q11574": "s",
+    "Q11579": "K",
     "Q11582": "L",
     "Q12129": "pc",
     "Q12438": "N",
-    "Q16068": "DM",
     "Q1811": "AU",
     "Q20764": "Ma",
     "Q2101": "e",
@@ -142,7 +142,7 @@ def load_engine(engine_data):

     engine.stats = {
         'sent_search_count': 0,  # sent search
-        'search_count': 0,  # succesful search
+        'search_count': 0,  # successful search
         'result_count': 0,
         'engine_time': 0,
         'engine_time_count': 0,
@@ -171,7 +171,7 @@ def load_engine(engine_data):
         categories.setdefault(category_name, []).append(engine)

     if engine.shortcut in engine_shortcuts:
-        logger.error('Engine config error: ambigious shortcut: {0}'.format(engine.shortcut))
+        logger.error('Engine config error: ambiguous shortcut: {0}'.format(engine.shortcut))
         sys.exit(1)

     engine_shortcuts[engine.shortcut] = engine.name
@@ -52,8 +52,7 @@ def request(query, params):
                               offset=offset)

     params['url'] = base_url + search_path
-    params['headers']['User-Agent'] = ('Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 '
-                                       '(KHTML, like Gecko) Chrome/97.0.4692.71 Safari/537.36')
+    params['headers']['Accept'] = 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8'

     return params

@@ -68,11 +67,13 @@ def response(resp):
     for result in eval_xpath(dom, '//div[@class="sa_cc"]'):
         link = eval_xpath(result, './/h3/a')[0]
         url = link.attrib.get('href')
+        pretty_url = extract_text(eval_xpath(result, './/cite'))
         title = extract_text(link)
         content = extract_text(eval_xpath(result, './/p'))

         # append result
         results.append({'url': url,
+                        'pretty_url': pretty_url,
                         'title': title,
                         'content': content})

@@ -80,11 +81,13 @@ def response(resp):
     for result in eval_xpath(dom, '//li[@class="b_algo"]'):
         link = eval_xpath(result, './/h2/a')[0]
         url = link.attrib.get('href')
+        pretty_url = extract_text(eval_xpath(result, './/cite'))
         title = extract_text(link)
         content = extract_text(eval_xpath(result, './/p'))

         # append result
         results.append({'url': url,
+                        'pretty_url': pretty_url,
                         'title': title,
                         'content': content})
@@ -70,7 +70,7 @@ def request(query, params):
     if params['time_range'] in time_range_dict:
         params['url'] += time_range_string.format(interval=time_range_dict[params['time_range']])

-    # bing videos did not like "older" versions < 70.0.1 when selectin other
+    # bing videos did not like "older" versions < 70.0.1 when selecting other
     # languages then 'en' .. very strange ?!?!
     params['headers']['User-Agent'] = 'Mozilla/5.0 (X11; Linux x86_64; rv:73.0.1) Gecko/20100101 Firefox/73.0.1'
@@ -18,7 +18,7 @@ from searx.poolrequests import get

 # about
 about = {
-    "website": 'https://lite.duckduckgo.com/lite',
+    "website": 'https://lite.duckduckgo.com/lite/',
     "wikidata_id": 'Q12805',
     "official_api_documentation": 'https://duckduckgo.com/api',
     "use_official_api": False,
@@ -45,7 +45,7 @@ language_aliases = {
 time_range_dict = {'day': 'd', 'week': 'w', 'month': 'm', 'year': 'y'}

 # search-url
-url = 'https://lite.duckduckgo.com/lite'
+url = 'https://lite.duckduckgo.com/lite/'
 url_ping = 'https://duckduckgo.com/t/sl_l'
|
@@ -73,6 +73,9 @@ def request(query, params):
     # link again and again ..

     params['headers']['Content-Type'] = 'application/x-www-form-urlencoded'
+    params['headers']['Origin'] = 'https://lite.duckduckgo.com'
+    params['headers']['Referer'] = 'https://lite.duckduckgo.com/'
+    params['headers']['User-Agent'] = 'Mozilla/5.0'

     # initial page does not have an offset
     if params['pageno'] == 2:
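The hunk above adds browser-like `Origin`/`Referer`/`User-Agent` headers to the form POST, alongside the existing `Content-Type`. A self-contained sketch of how such a `params` dict is mutated (header values follow the diff; the surrounding engine machinery is assumed):

```python
def prepare_request(params):
    # mimic a browser form submission: form-encoded body plus matching
    # Origin/Referer headers (values taken from the hunk above)
    params['headers']['Content-Type'] = 'application/x-www-form-urlencoded'
    params['headers']['Origin'] = 'https://lite.duckduckgo.com'
    params['headers']['Referer'] = 'https://lite.duckduckgo.com/'
    params['headers']['User-Agent'] = 'Mozilla/5.0'
    return params

params = {'headers': {}}
prepare_request(params)
```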
@@ -80,7 +80,7 @@ def response(resp):
     # * book / performing art / film / television / media franchise / concert tour / playwright
     # * prepared food
     # * website / software / os / programming language / file format / software engineer
-    # * compagny
+    # * company

     content = ''
     heading = search_res.get('Heading', '')
@@ -40,7 +40,7 @@ def response(resp):

     search_res = loads(resp.text)

-    # check if items are recieved
+    # check if items are received
     if 'items' not in search_res:
         return []

@@ -109,22 +109,15 @@ filter_mapping = {
 # specific xpath variables
 # ------------------------

-# google results are grouped into <div class="jtfYYd ..." ../>
-results_xpath = '//div[contains(@class, "jtfYYd")]'
+results_xpath = '//div[contains(@class, "MjjYud")]'
+title_xpath = './/h3[1]'
+href_xpath = './/a/@href'
+content_xpath = './/div[@data-sncf]'
 results_xpath_mobile_ui = '//div[contains(@class, "g ")]'

 # google *sections* are no usual *results*, we ignore them
 g_section_with_header = './g-section-with-header'

-# the title is a h3 tag relative to the result group
-title_xpath = './/h3[1]'
-
-# in the result group there is <div class="yuRUbf" ../> it's first child is a <a
-# href=...>
-href_xpath = './/div[@class="yuRUbf"]//a/@href'
-
-# in the result group there is <div class="VwiC3b ..." ../> containing the *content*
-content_xpath = './/div[contains(@class, "VwiC3b")]'

 # Suggestions are links placed in a *card-section*, we extract only the text
 # from the links not the links itself.
@@ -213,7 +206,8 @@ def request(query, params):
     additional_parameters = {}
     if use_mobile_ui:
         additional_parameters = {
-            'async': 'use_ac:true,_fmt:pc',
+            'asearch': 'arc',
+            'async': 'use_ac:true,_fmt:html',
         }

     # https://www.google.de/search?q=corona&hl=de&lr=lang_de&start=0&tbs=qdr%3Ad&safe=medium
@@ -288,7 +282,7 @@ def response(resp):

         # google *sections*
         if extract_text(eval_xpath(result, g_section_with_header)):
-            logger.debug("ingoring <g-section-with-header>")
+            logger.debug("ignoring <g-section-with-header>")
             continue

         try:
@@ -2,7 +2,7 @@
 """Google (News)

 For detailed description of the *REST-full* API see: `Query Parameter
-Definitions`_. Not all parameters can be appied:
+Definitions`_. Not all parameters can be applied:

 - num_ : the number of search results is ignored
 - save_ : is ignored / Google-News results are always *SafeSearch*
|
@ -155,7 +155,7 @@ def response(resp):
|
||||||
padding = (4 -(len(jslog) % 4)) * "="
|
padding = (4 -(len(jslog) % 4)) * "="
|
||||||
jslog = b64decode(jslog + padding)
|
jslog = b64decode(jslog + padding)
|
||||||
except binascii.Error:
|
except binascii.Error:
|
||||||
# URL cant be read, skip this result
|
# URL can't be read, skip this result
|
||||||
continue
|
continue
|
||||||
|
|
||||||
# now we have : b'[null, ... null,"https://www.cnn.com/.../index.html"]'
|
# now we have : b'[null, ... null,"https://www.cnn.com/.../index.html"]'
|
||||||
|
|
|
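The padding line in the hunk above restores the `=` characters that the encoded `jslog` payload arrives without; `b64decode` raises `binascii.Error` on mis-padded input, and the `except` branch skips the result. A quick standalone illustration of the same idiom:

```python
import base64
import binascii

def decode_jslog(jslog):
    # re-add the stripped '=' padding so the length is a multiple of 4
    padding = (4 - (len(jslog) % 4)) * "="
    try:
        return base64.b64decode(jslog + padding)
    except binascii.Error:
        return None  # URL can't be read, caller skips this result

print(decode_jslog('aGVsbG8'))  # b'hello'
```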
@@ -2,7 +2,7 @@
 """Google (Video)

 For detailed description of the *REST-full* API see: `Query Parameter
-Definitions`_. Not all parameters can be appied.
+Definitions`_. Not all parameters can be applied.

 .. _admonition:: Content-Security-Policy (CSP)

@@ -163,7 +163,7 @@ def response(resp):

         # google *sections*
         if extract_text(eval_xpath(result, g_section_with_header)):
-            logger.debug("ingoring <g-section-with-header>")
+            logger.debug("ignoring <g-section-with-header>")
             continue

         title = extract_text(eval_xpath_getindex(result, title_xpath, 0))
@@ -0,0 +1,93 @@
+from urllib.parse import urlencode
+import json
+import re
+from datetime import datetime
+
+# config
+categories = ['general', 'images', 'music', 'videos']
+paging = True
+time_range_support = True
+
+# url
+base_url = 'https://api.ipfs-search.com/v1/'
+search_string = 'search?{query} first-seen:{time_range} metadata.Content-Type:({mime_type})&page={page} '
+
+
+mime_types_map = {
+    'general': "*",
+    'images': 'image*',
+    'music': 'audio*',
+    'videos': 'video*'
+}
+
+time_range_map = {'day': '[ now-24h\/h TO *]',
+                  'week': '[ now\/h-7d TO *]',
+                  'month': '[ now\/d-30d TO *]',
+                  'year': '[ now\/d-1y TO *]'}
+
+ipfs_url = 'https://gateway.ipfs.io/ipfs/{hash}'
+
+
+def request(query, params):
+    mime_type = mime_types_map.get(params['category'], '*')
+    time_range = time_range_map.get(params['time_range'], '*')
+    search_path = search_string.format(
+        query=urlencode({'q': query}),
+        time_range=time_range,
+        page=params['pageno'],
+        mime_type=mime_type)
+
+    params['url'] = base_url + search_path
+
+    return params
+
+
+def clean_html(text):
+    if not text:
+        return ""
+    return str(re.sub(re.compile('<.*?>'), '', text))
+
+
+def create_base_result(record):
+    url = ipfs_url.format(hash=record.get('hash'))
+    title = clean_html(record.get('title'))
+    published_date = datetime.strptime(record.get('first-seen'), '%Y-%m-%dT%H:%M:%SZ')
+    return {'url': url,
+            'title': title,
+            'publishedDate': published_date}
+
+
+def create_text_result(record):
+    result = create_base_result(record)
+    description = clean_html(record.get('description'))
+    result['description'] = description
+    return result
+
+
+def create_image_result(record):
+    result = create_base_result(record)
+    result['img_src'] = result['url']
+    result['template'] = 'images.html'
+    return result
+
+
+def create_video_result(record):
+    result = create_base_result(record)
+    result['thumbnail'] = ''
+    result['template'] = 'videos.html'
+    return result
+
+
+def response(resp):
+    api_results = json.loads(resp.text)
+    results = []
+    for result in api_results.get('hits', []):
+        mime_type = result.get('mimetype', 'text/plain')
+
+        if mime_type.startswith('image'):
+            results.append(create_image_result(result))
+        elif mime_type.startswith('video'):
+            results.append(create_video_result(result))
+        else:
+            results.append(create_text_result(result))
+    return results
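Two small pieces carry most of the new engine above: `clean_html` strips tags with a non-greedy regex, and `response` dispatches on the MIME-type prefix to pick a result template. A condensed, standalone version of both:

```python
import re

def clean_html(text):
    # drop anything tag-shaped; non-greedy so '<b>x</b>' -> 'x'
    if not text:
        return ""
    return re.sub(r'<.*?>', '', text)

def template_for(mime_type):
    # same prefix dispatch as the engine's response() above
    # ('text' stands in for the default text-result branch)
    if mime_type.startswith('image'):
        return 'images.html'
    if mime_type.startswith('video'):
        return 'videos.html'
    return 'text'

print(clean_html('<b>IPFS</b> search'))  # IPFS search
```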
@@ -0,0 +1,55 @@
+# SPDX-License-Identifier: AGPL-3.0-or-later
+"""
+ Omnom (General)
+"""
+
+from json import loads
+from urllib.parse import urlencode
+
+# about
+about = {
+    "website": 'https://github.com/asciimoo/omnom',
+    "wikidata_id": None,
+    "official_api_documentation": 'http://your.omnom.host/api',
+    "use_official_api": True,
+    "require_api_key": False,
+    "results": 'JSON',
+}
+
+# engine dependent config
+categories = ['general']
+paging = True
+
+# search-url
+base_url = None
+search_path = 'bookmarks?{query}&pageno={pageno}&format=json'
+bookmark_path = 'bookmark?id='
+
+
+# do search-request
+def request(query, params):
+    params['url'] = base_url +\
+        search_path.format(query=urlencode({'query': query}),
+                           pageno=params['pageno'])
+
+    return params
+
+
+# get response from search-request
+def response(resp):
+    results = []
+    json = loads(resp.text)
+
+    # parse results
+    for r in json.get('Bookmarks', {}):
+        content = r['url']
+        if r.get('notes'):
+            content += ' - ' + r['notes']
+        results.append({
+            'title': r['title'],
+            'content': content,
+            'url': base_url + bookmark_path + str(r['id']),
+        })
+
+    # return results
+    return results
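URL assembly in the new engine above is plain string formatting over `urlencode`. For instance, with the placeholder host from the engine's `official_api_documentation` standing in for a configured `base_url`:

```python
from urllib.parse import urlencode

# 'your.omnom.host' is the placeholder host from the about block above;
# base_url is instance-specific in the real engine
base_url = 'http://your.omnom.host/api/'
search_path = 'bookmarks?{query}&pageno={pageno}&format=json'

url = base_url + search_path.format(query=urlencode({'query': 'search engine'}),
                                    pageno=2)
print(url)  # http://your.omnom.host/api/bookmarks?query=search+engine&pageno=2&format=json
```

`urlencode` takes care of escaping, so spaces in the query become `+`.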
@@ -72,7 +72,7 @@ def response(resp):
         elif properties.get('osm_type') == 'R':
             osm_type = 'relation'
         else:
-            # continue if invalide osm-type
+            # continue if invalid osm-type
             continue

         url = result_base_url.format(osm_type=osm_type,
@@ -71,7 +71,7 @@ def response(resp):
         if 'downloadUrl' in result:
             new_result['torrentfile'] = result['downloadUrl']

-        # magnet link *may* be in guid, but it may be also idential to infoUrl
+        # magnet link *may* be in guid, but it may be also identical to infoUrl
         if 'guid' in result and isinstance(result['guid'], str) and result['guid'].startswith('magnet'):
             new_result['magnetlink'] = result['guid']

@@ -23,6 +23,7 @@ categories = ['music']
 paging = True
 api_client_id = None
 api_client_secret = None
+timeout = 10.0

 # search-url
 url = 'https://api.spotify.com/'
@@ -40,9 +41,10 @@ def request(query, params):

     r = requests.post(
         'https://accounts.spotify.com/api/token',
+        timeout=timeout,
         data={'grant_type': 'client_credentials'},
         headers={'Authorization': 'Basic ' + base64.b64encode(
-            "{}:{}".format(api_client_id, api_client_secret).encode()
+            "{}:{}".format(api_client_id, api_client_secret).encode(),
         ).decode()}
     )
     j = loads(r.text)
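The substantive change above is the new `timeout=` keyword; the `Authorization` header it sits next to is HTTP Basic auth for Spotify's client-credentials flow, i.e. `client_id:client_secret` base64-encoded. Standalone, with dummy credentials:

```python
import base64

def basic_auth_header(client_id, client_secret):
    # RFC 7617 Basic scheme: 'Basic ' + base64("id:secret")
    token = base64.b64encode(
        "{}:{}".format(client_id, client_secret).encode()
    ).decode()
    return 'Basic ' + token

print(basic_auth_header('id', 'secret'))  # Basic aWQ6c2VjcmV0
```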
@@ -51,7 +51,7 @@ search_url = base_url + 'sp/search?'

 # specific xpath variables
 # ads xpath //div[@id="results"]/div[@id="sponsored"]//div[@class="result"]
-# not ads: div[@class="result"] are the direct childs of div[@id="results"]
+# not ads: div[@class="result"] are the direct children of div[@id="results"]
 results_xpath = '//div[@class="w-gl__result__main"]'
 link_xpath = './/a[@class="w-gl__result-title result-link"]'
 content_xpath = './/p[@class="w-gl__description"]'
@@ -91,15 +91,13 @@ def get_sc_code(headers):
     dom = html.fromstring(resp.text)

     try:
-        # href --> '/?sc=adrKJMgF8xwp20'
-        href = eval_xpath(dom, '//a[@class="footer-home__logo"]')[0].get('href')
+        sc_code = eval_xpath(dom, '//input[@name="sc"]')[0].get('value')
     except IndexError as exc:
         # suspend startpage API --> https://github.com/searxng/searxng/pull/695
         raise SearxEngineResponseException(
             suspended_time=7 * 24 * 3600, message="PR-695: query new sc time-stamp failed!"
         ) from exc

-    sc_code = href[5:]
     sc_code_ts = time()
     logger.debug("new value is: %s", sc_code)

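The fix above reads the `sc` token from a hidden `<input name="sc">` element instead of slicing it out of the footer logo's `href`. searx does this with lxml/xpath; a rough stdlib-only equivalent of the same extraction, using `html.parser`:

```python
from html.parser import HTMLParser

class ScExtractor(HTMLParser):
    """Collect the value attribute of the first <input name="sc" ...>."""

    def __init__(self):
        super().__init__()
        self.sc_code = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == 'input' and attrs.get('name') == 'sc' and self.sc_code is None:
            self.sc_code = attrs.get('value')

parser = ScExtractor()
# sample markup; the token value is the example from the old comment above
parser.feed('<form><input type="hidden" name="sc" value="adrKJMgF8xwp20"></form>')
print(parser.sc_code)  # adrKJMgF8xwp20
```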
@@ -216,7 +214,7 @@ def _fetch_supported_languages(resp):
     # native name, the English name of the writing script used by the language,
     # or occasionally something else entirely.

-    # this cases are so special they need to be hardcoded, a couple of them are mispellings
+    # this cases are so special they need to be hardcoded, a couple of them are misspellings
     language_names = {
         'english_uk': 'en-GB',
         'fantizhengwen': ['zh-TW', 'zh-HK'],
@@ -49,7 +49,7 @@ WIKIDATA_PROPERTIES = {
 # SERVICE wikibase:label: https://en.wikibooks.org/wiki/SPARQL/SERVICE_-_Label#Manual_Label_SERVICE
 # https://en.wikibooks.org/wiki/SPARQL/WIKIDATA_Precision,_Units_and_Coordinates
 # https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Data_model
-# optmization:
+# optimization:
 # * https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/query_optimization
 # * https://github.com/blazegraph/database/wiki/QueryHints
 QUERY_TEMPLATE = """
|
@ -335,7 +335,7 @@ def get_attributes(language):
|
||||||
add_amount('P2046') # area
|
add_amount('P2046') # area
|
||||||
add_amount('P281') # postal code
|
add_amount('P281') # postal code
|
||||||
add_label('P38') # currency
|
add_label('P38') # currency
|
||||||
add_amount('P2048') # heigth (building)
|
add_amount('P2048') # height (building)
|
||||||
|
|
||||||
# Media
|
# Media
|
||||||
for p in ['P400', # platform (videogames, computing)
|
for p in ['P400', # platform (videogames, computing)
|
||||||
|
|
|
@@ -50,7 +50,7 @@ def request(query, params):

 # replace private user area characters to make text legible
 def replace_pua_chars(text):
-    pua_chars = {'\uf522': '\u2192',  # rigth arrow
+    pua_chars = {'\uf522': '\u2192',  # right arrow
                  '\uf7b1': '\u2115',  # set of natural numbers
                  '\uf7b4': '\u211a',  # set of rational numbers
                  '\uf7b5': '\u211d',  # set of real numbers
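Replacing private-use-area characters as above is a pure character-to-character mapping, which `str.translate` handles in a single pass; a sketch using two entries from the table above:

```python
# same mapping idea as replace_pua_chars above, via str.translate;
# only two of the engine's PUA entries are reproduced here
pua_chars = {'\uf522': '\u2192',  # right arrow
             '\uf7b1': '\u2115'}  # set of natural numbers

table = str.maketrans(pua_chars)

def replace_pua_chars(text):
    return text.translate(table)

print(replace_pua_chars('x \uf522 \uf7b1'))  # x → ℕ
```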
@@ -35,7 +35,7 @@ time_range_support = False

 time_range_url = '&hours={time_range_val}'
 '''Time range URL parameter in the in :py:obj:`search_url`. If no time range is
-requested by the user, the URL paramter is an empty string. The
+requested by the user, the URL parameter is an empty string. The
 ``{time_range_val}`` replacement is taken from the :py:obj:`time_range_map`.

 .. code:: yaml
@@ -30,7 +30,7 @@ def get_external_url(url_id, item_id, alternative="default"):
     """Return an external URL or None if url_id is not found.

     url_id can take value from data/external_urls.json
-    The "imdb_id" value is automaticaly converted according to the item_id value.
+    The "imdb_id" value is automatically converted according to the item_id value.

     If item_id is None, the raw URL with the $1 is returned.
     """
@@ -78,7 +78,7 @@ def load_single_https_ruleset(rules_path):
     rules = []
     exclusions = []

-    # parse childs from ruleset
+    # parse children from ruleset
     for ruleset in root:
         # this child define a target
         if ruleset.tag == 'target':
@@ -2435,7 +2435,7 @@
 		<rule from="^http://widgets\.yahoo\.com/[^?]*"
 			to="https://www.yahoo.com/" />

-		<rule from="^http://((?:\w\w|fr-ca\.actualites|address|\w\w\.address|admanager|(?:\w\w|global)\.adserver|adspecs|\w+\.adspecs|\w+\.adspecs-new|advertising|\w\w\.advertising|beap\.adx|c5a?\.ah|(?:s-)?cookex\.amp|(?:[aosz]|apac|y3?)\.analytics|anc|answers|(?:\w\w|espanol|malaysia)\.answers|antispam|\w\w\.antispam|vn\.antoan|au\.apps|global\.ard|astrology|\w\w\.astrology|hk\.(?:(?:info|f1\.master|f1\.page|search|store|edit\.store|user)\.)?auctions|autos|\w\w\.autos|ar\.ayuda|(?:clicks\.beap|csc\.beap|pn1|row|us)\.bc|tw\.bid|tw\.(?:campaign|master|mb|page|search|store|user)\.bid|(?:m\.)?tw\.bigdeals|tw\.billing|biz|boss|(?:tw\.partner|tw)\.buy|(?:\w\w\.)?calendar|careers|\w\w\.cars|(?:\w\w|es-us)\.celebridades|(?:\w\w\.)?celebrity|tw\.charity|i?chart|(?:\w\w|es-us)\.cine|\w\w\.cinema|(?:\w\w|es-us)\.clima|migration\.cn|(?:deveopers\.)?commercecentral|br\.contribuidores|(?:uk\.)?contributor|au\.dating|(?:\w\w|es-us)\.deportes|developer|tw\.dictionary|dir|downloads|s-b\.dp|(?:eu\.|na\.|sa\.|tw\.)?edit|tw\.(?:ysm\.)?emarketing|en-maktoob|\w\w\.entertainment|espanol|edit\.europe|eurosport|(?:de|es|it|uk)\.eurosport|everything|\w\w\.everything|\w+\.fantasysports|au\.fango|tw\.fashion|br\.financas|finance|(?:\w\w|tw\.chart|espanol|tw\.futures|streamerapi)\.finance|(?:\w\w|es-us)\.finanzas|nz\.rss\.food|nz\.forums|games|(?:au|ca|uk)\.games|geo|gma|groups|(?:\w\w|asia|espanol|es-us|fr-ca|moderators)\.groups|health|help|(?:\w\w|secure)\.help|homes|(?:tw|tw\.v2)\.house|info|\w\w\.info|tw\.tool\.ks|au\.launch|legalredirect|(?:\w\w)\.lifestyle|(?:gh\.bouncer\.)?login|us\.l?rd|local|\w\w\.local|m|r\.m|\w\w\.m|mail|(?:\w\w\.overview|[\w-]+(?:\.c\.yom)?)\.mail|maktoob|malaysia|tw\.(?:user\.)?mall|maps|(?:\w\w|espanol|sgws2)\.maps|messenger|(?:\w\w|malaysia)\.messenger|\w\w\.meteo|mlogin|mobile|(?:\w\w|espanol|malaysia)\.mobile|tw\.(?:campaign\.)?money|tw\.movie|movies|(?:au|ca|nz|au\.rss|nz\.rss|tw|uk)\.movies|[\w.-]+\.msg|(?:\w\w|es-us)\.mujer|music|ca\.music|[\w-]+\.musica|my|us\.my|de\.nachrichten|ucs\.netsvs|news|(?:au|ca|fr|gr|hk|in|nz|ph|nz\.rss|sg|tw|uk)\.news|cookiex\.ngd|(?:\w\w|es-us)\.noticias|omg|(?:\w\w|es-us)\.omg|au\.oztips|rtb\.pclick|pilotx1|pipes|play|playerio|privacy|profile|tw\.promo|(?:au|hk|nz)\.promotions|publishing|(?:analytics|mailapps|media|ucs|us-locdrop|video)\.query|hk\.rd|(?:\w\w\.|fr-ca\.)?safely|screen|(?:\w\w|es-us)\.screen|scribe|search|(?:\w\w|w\w\.blog|\w\w\.dictionary|finance|\w\w\.finance|images|\w\w\.images|\w\w\.knowledge|\w\w\.lifestyle|\w\w\.local|malaysia|movies|\w\w\.movies|news|\w\w\.news|malaysia\.news|r|recipes|\w\w\.recipes|shine|shopping|\w\w\.shopping|sports|\w\w\.sports|tools|au\.tv|video|\w\w\.video|malaysia\.video)\.search|sec|rtb\.pclick\.secure|security|tw\.security|\w\w\.seguranca|\w\w\.seguridad|es-us\.seguridad|\w\w\.seguro|tw\.serviceplus|settings|shine|ca\.shine|shopping|ca\.shopping|\w+\.sitios|dashboard\.slingstone|(?:au\.|order\.)?smallbusiness|smarttv|rd\.software|de\.spiele|sports|(?:au|ca|fr|hk|nz|ph|profiles|au\.rss|nz\.rss|tw)\.sports|tw\.stock|au\.thehype|\w\w\.tiempo|es\.todo|toolbar|(?:\w\w|data|malaysia)\.toolbar|(?:au|nz)\.totaltravel|transparency|travel|tw\.travel||tv|(?:ar|au|de|fr|es|es-us|it|mx|nz|au\.rss|uk)\.tv|tw\.uwant|(?:mh|nz|qos|yep)\.video|weather|(?:au|ca|hk|in|nz|sg|ph|uk|us)\.weather|de\.wetter|www|au\.yel|video\.media\.yql|dmros\.ysm)\.)?yahoo\.com/"
+		<rule from="^http://((?:\w\w|fr-ca\.actualites|address|\w\w\.address|admanager|(?:\w\w|global)\.adserver|adspecs|\w+\.adspecs|\w+\.adspecs-new|advertising|\w\w\.advertising|beap\.adx|c5a?\.ah|(?:s-)?cookex\.amp|(?:[aosz]|apac|y3?)\.analytics|anc|answers|(?:\w\w|espanol|malaysia)\.answers|antispam|\w\w\.antispam|vn\.antoan|au\.apps|global\.ard|astrology|\w\w\.astrology|hk\.(?:(?:info|f1\.master|f1\.page|search|store|edit\.store|user)\.)?auctions|autos|\w\w\.autos|ar\.ayuda|(?:clicks\.beap|csc\.beap|pn1|row|us)\.bc|tw\.bid|tw\.(?:campaign|master|mb|page|search|store|user)\.bid|(?:m\.)?tw\.bigdeals|tw\.billing|biz|boss|(?:tw\.partner|tw)\.buy|(?:\w\w\.)?calendar|careers|\w\w\.cars|(?:\w\w|es-us)\.celebridades|(?:\w\w\.)?celebrity|tw\.charity|i?chart|(?:\w\w|es-us)\.cine|\w\w\.cinema|(?:\w\w|es-us)\.clima|migration\.cn|(?:developers\.)?commercecentral|br\.contribuidores|(?:uk\.)?contributor|au\.dating|(?:\w\w|es-us)\.deportes|developer|tw\.dictionary|dir|downloads|s-b\.dp|(?:eu\.|na\.|sa\.|tw\.)?edit|tw\.(?:ysm\.)?emarketing|en-maktoob|\w\w\.entertainment|espanol|edit\.europe|eurosport|(?:de|es|it|uk)\.eurosport|everything|\w\w\.everything|\w+\.fantasysports|au\.fango|tw\.fashion|br\.financas|finance|(?:\w\w|tw\.chart|espanol|tw\.futures|streamerapi)\.finance|(?:\w\w|es-us)\.finanzas|nz\.rss\.food|nz\.forums|games|(?:au|ca|uk)\.games|geo|gma|groups|(?:\w\w|asia|espanol|es-us|fr-ca|moderators)\.groups|health|help|(?:\w\w|secure)\.help|homes|(?:tw|tw\.v2)\.house|info|\w\w\.info|tw\.tool\.ks|au\.launch|legalredirect|(?:\w\w)\.lifestyle|(?:gh\.bouncer\.)?login|us\.l?rd|local|\w\w\.local|m|r\.m|\w\w\.m|mail|(?:\w\w\.overview|[\w-]+(?:\.c\.yom)?)\.mail|maktoob|malaysia|tw\.(?:user\.)?mall|maps|(?:\w\w|espanol|sgws2)\.maps|messenger|(?:\w\w|malaysia)\.messenger|\w\w\.meteo|mlogin|mobile|(?:\w\w|espanol|malaysia)\.mobile|tw\.(?:campaign\.)?money|tw\.movie|movies|(?:au|ca|nz|au\.rss|nz\.rss|tw|uk)\.movies|[\w.-]+\.msg|(?:\w\w|es-us)\.mujer|music|ca\.music|[\w-]+\.musica|my|us\.my|de\.nachrichten|ucs\.netsvs|news|(?:au|ca|fr|gr|hk|in|nz|ph|nz\.rss|sg|tw|uk)\.news|cookiex\.ngd|(?:\w\w|es-us)\.noticias|omg|(?:\w\w|es-us)\.omg|au\.oztips|rtb\.pclick|pilotx1|pipes|play|playerio|privacy|profile|tw\.promo|(?:au|hk|nz)\.promotions|publishing|(?:analytics|mailapps|media|ucs|us-locdrop|video)\.query|hk\.rd|(?:\w\w\.|fr-ca\.)?safely|screen|(?:\w\w|es-us)\.screen|scribe|search|(?:\w\w|w\w\.blog|\w\w\.dictionary|finance|\w\w\.finance|images|\w\w\.images|\w\w\.knowledge|\w\w\.lifestyle|\w\w\.local|malaysia|movies|\w\w\.movies|news|\w\w\.news|malaysia\.news|r|recipes|\w\w\.recipes|shine|shopping|\w\w\.shopping|sports|\w\w\.sports|tools|au\.tv|video|\w\w\.video|malaysia\.video)\.search|sec|rtb\.pclick\.secure|security|tw\.security|\w\w\.seguranca|\w\w\.seguridad|es-us\.seguridad|\w\w\.seguro|tw\.serviceplus|settings|shine|ca\.shine|shopping|ca\.shopping|\w+\.sitios|dashboard\.slingstone|(?:au\.|order\.)?smallbusiness|smarttv|rd\.software|de\.spiele|sports|(?:au|ca|fr|hk|nz|ph|profiles|au\.rss|nz\.rss|tw)\.sports|tw\.stock|au\.thehype|\w\w\.tiempo|es\.todo|toolbar|(?:\w\w|data|malaysia)\.toolbar|(?:au|nz)\.totaltravel|transparency|travel|tw\.travel||tv|(?:ar|au|de|fr|es|es-us|it|mx|nz|au\.rss|uk)\.tv|tw\.uwant|(?:mh|nz|qos|yep)\.video|weather|(?:au|ca|hk|in|nz|sg|ph|uk|us)\.weather|de\.wetter|www|au\.yel|video\.media\.yql|dmros\.ysm)\.)?yahoo\.com/"
 			to="https://$1yahoo.com/" />

 		<rule from="^http://([\w-]+)\.yahoofs\.com/"
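Each `<rule>` element like the Yahoo ones above pairs an anchored regex (`from`) with a replacement containing `$1`-style backreferences (`to`). Applying one rule is a single regex substitution; in Python the backreference is written `\1`. A toy rule in the spirit of the ruleset (deliberately tiny, not the full Yahoo pattern):

```python
import re

# simplified illustrative rule: rewrite http://www.yahoo.com/... to https
rule_from = r'^http://(www\.)yahoo\.com/'
rule_to = r'https://\1yahoo.com/'

def apply_rule(url):
    # non-matching URLs pass through unchanged
    return re.sub(rule_from, rule_to, url)

print(apply_rule('http://www.yahoo.com/mail'))  # https://www.yahoo.com/mail
```

Note the real ruleset uses optional groups; with `re.sub`, an unmatched optional group referenced by `\1` raises an error, which is one reason the loader parses and validates each rule.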
@@ -11,7 +11,11 @@ default_on = False

 def on_result(request, search, result):
     q = search.search_query.query
-    qs = shlex.split(q)
+    # WARN: shlex.quote is designed only for Unix shells and may be vulnerable
+    # to command injection on non-POSIX compliant shells (Windows)
+    # https://docs.python.org/3/library/shlex.html#shlex.quote
+    squote = shlex.quote(q)
+    qs = shlex.split(squote)
     spitems = [x.lower() for x in qs if ' ' in x]
     mitems = [x.lower() for x in qs if x.startswith('-')]
     siteitems = [x.lower() for x in qs if x.startswith('site:')]
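The plugin change above runs the raw query through `shlex.quote` before `shlex.split`, so shell metacharacters in the query cannot break tokenization (with the POSIX-only caveat the new comment cites). A quick look at how the two calls behave on their own:

```python
import shlex

# plain split tokenizes on whitespace, shell-style
q = 'linux -torrent site:example.com'
print(shlex.split(q))  # ['linux', '-torrent', 'site:example.com']

# quote() wraps strings that are unsafe for a POSIX shell in single quotes
print(shlex.quote('a b'))  # 'a b'

# quoting first means split() returns the quoted string as a single token
print(shlex.split(shlex.quote('a b')))  # ['a b']
```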
@@ -97,7 +97,7 @@ class SessionSinglePool(requests.Session):
         self.mount('http://', http_adapter)

     def close(self):
-        """Call super, but clear adapters since there are managed globaly"""
+        """Call super, but clear adapters since there are managed globally"""
         self.adapters.clear()
         super().close()

@@ -62,7 +62,7 @@ class Setting:
         return self.value

     def save(self, name, resp):
-        """Save cookie ``name`` in the HTTP reponse obect
+        """Save cookie ``name`` in the HTTP response object

         If needed, its overwritten in the inheritance."""
         resp.set_cookie(name, self.value, max_age=COOKIE_MAX_AGE)
@@ -125,7 +125,7 @@ class MultipleChoiceSetting(EnumStringSetting):
             self.value.append(choice)

     def save(self, name, resp):
-        """Save cookie ``name`` in the HTTP reponse obect
+        """Save cookie ``name`` in the HTTP response object
         """
         resp.set_cookie(name, ','.join(self.value), max_age=COOKIE_MAX_AGE)

@@ -160,7 +160,7 @@ class SetSetting(Setting):
         self.values = set(elements)  # pylint: disable=attribute-defined-outside-init

     def save(self, name, resp):
-        """Save cookie ``name`` in the HTTP reponse obect
+        """Save cookie ``name`` in the HTTP response object
         """
         resp.set_cookie(name, ','.join(self.values), max_age=COOKIE_MAX_AGE)

@@ -209,7 +209,7 @@ class MapSetting(Setting):
         self.key = data  # pylint: disable=attribute-defined-outside-init

     def save(self, name, resp):
-        """Save cookie ``name`` in the HTTP reponse obect
+        """Save cookie ``name`` in the HTTP response object
         """
         if hasattr(self, 'key'):
             resp.set_cookie(name, self.key, max_age=COOKIE_MAX_AGE)
@@ -253,7 +253,7 @@ class SwitchableSetting(Setting):
         self.enabled.add(choice['id'])

     def save(self, resp):  # pylint: disable=arguments-differ
-        """Save cookie in the HTTP reponse obect
+        """Save cookie in the HTTP response object
         """
         resp.set_cookie('disabled_{0}'.format(self.value), ','.join(self.disabled), max_age=COOKIE_MAX_AGE)
         resp.set_cookie('enabled_{0}'.format(self.value), ','.join(self.enabled), max_age=COOKIE_MAX_AGE)
@ -517,7 +517,7 @@ class Preferences:
|
||||||
return ret_val
|
return ret_val
|
||||||
|
|
||||||
def save(self, resp):
|
def save(self, resp):
|
||||||
"""Save cookie in the HTTP reponse obect
|
"""Save cookie in the HTTP response object
|
||||||
"""
|
"""
|
||||||
for user_setting_name, user_setting in self.key_value_settings.items():
|
for user_setting_name, user_setting in self.key_value_settings.items():
|
||||||
if user_setting.locked:
|
if user_setting.locked:
|
||||||
|
|
|
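The `save` methods above persist preferences client-side by serializing them into cookies: multi-value settings are stored as a comma-joined string under a long `max_age`. A minimal stand-alone sketch of that round trip (the helper names here are illustrative, not searx's API):

```python
COOKIE_MAX_AGE = 60 * 60 * 24 * 365 * 5  # illustrative: a multi-year expiry


def serialize_values(values):
    """Serialize a multi-value setting the way the first save() above does."""
    return ','.join(values)


def parse_cookie_value(raw):
    """Parse the comma-joined cookie back into a list of values."""
    return raw.split(',') if raw else []


stored = serialize_values(['general', 'images'])
assert parse_cookie_value(stored) == ['general', 'images']
```

The comma join is why individual setting values themselves must not contain commas.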
@@ -197,10 +197,10 @@ class BangParser(QueryPartParser):
             self.raw_text_query.enginerefs.append(EngineRef(value, 'none'))
             return True

-        # check if prefix is equal with categorie name
+        # check if prefix is equal with category name
         if value in categories:
             # using all engines for that search, which
-            # are declared under that categorie name
+            # are declared under that category name
             self.raw_text_query.enginerefs.extend(EngineRef(engine.name, value)
                                                   for engine in categories[value]
                                                   if (engine.name, value) not in self.raw_text_query.disabled_engines)

@@ -216,7 +216,7 @@ class BangParser(QueryPartParser):
                 self._add_autocomplete(first_char + suggestion)
             return

-        # check if query starts with categorie name
+        # check if query starts with category name
         for category in categories:
             if category.startswith(value):
                 self._add_autocomplete(first_char + category)

@@ -309,7 +309,7 @@ class RawTextQuery:

     def getFullQuery(self):
         """
-        get full querry including whitespaces
+        get full query including whitespaces
         """
         return '{0} {1}'.format(' '.join(self.query_parts), self.getQuery()).strip()
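The `BangParser` hunk above expands a category bang into one engine reference per engine declared under that category, skipping engines the user disabled. A simplified sketch of that comprehension, with `EngineRef` replaced by plain `(engine_name, category)` tuples:

```python
def expand_category(value, categories, disabled_engines):
    """Collect (engine_name, category) pairs for a category bang,
    skipping user-disabled engines - mirrors the comprehension above."""
    if value not in categories:
        return []
    return [(name, value)
            for name in categories[value]
            if (name, value) not in disabled_engines]


# hypothetical category/engine layout for illustration
categories = {'general': ['duckduckgo', 'wikipedia'], 'images': ['flickr']}
disabled = {('wikipedia', 'general')}
assert expand_category('general', categories, disabled) == [('duckduckgo', 'general')]
```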
@@ -143,9 +143,9 @@ def result_score(result, language):
         if language in domain_parts:
             weight *= 1.1

-    occurences = len(result['positions'])
+    occurrences = len(result['positions'])

-    return sum((occurences * weight) / position for position in result['positions'])
+    return sum((occurrences * weight) / position for position in result['positions'])


 class ResultContainer:

@@ -252,7 +252,7 @@ class ResultContainer:

             result['engines'] = set([result['engine']])

-        # strip multiple spaces and cariage returns from content
+        # strip multiple spaces and carriage returns from content
         if result.get('content'):
             result['content'] = WHITESPACE_REGEX.sub(' ', result['content'])

@@ -278,7 +278,7 @@ class ResultContainer:
                     return merged_result
             else:
                 # it's an image
-                # it's a duplicate if the parsed_url, template and img_src are differents
+                # it's a duplicate if the parsed_url, template and img_src are different
                 if result.get('img_src', '') == merged_result.get('img_src', ''):
                     return merged_result
         return None
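The renamed `occurrences` variable above counts how many positions a result was reported at; the score sums `occurrences * weight` divided by each position, so results reported often and early rank higher. A worked sketch, reduced from `result_score` to a single weight multiplier:

```python
def result_score(positions, weight=1.0):
    """Score = sum over positions of (occurrences * weight) / position."""
    occurrences = len(positions)
    return sum((occurrences * weight) / position for position in positions)


# a result reported at positions 1 and 2 by two engines:
# occurrences = 2, so the score is 2/1 + 2/2 = 3.0
assert result_score([1, 2]) == 3.0
```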
@@ -60,7 +60,7 @@ def run(engine_name_list, verbose):
         stderr.write(f'{BOLD_SEQ}Engine {name:30}{RESET_SEQ}Checking\n')
         checker = searx.search.checker.Checker(processor)
         checker.run()
-        if checker.test_results.succesfull:
+        if checker.test_results.successful:
             stdout.write(f'{BOLD_SEQ}Engine {name:30}{RESET_SEQ}{GREEN}OK{RESET_SEQ}\n')
             if verbose:
                 stdout.write(f'    {"found languages":15}: {" ".join(sorted(list(checker.test_results.languages)))}\n')
@@ -59,7 +59,7 @@ def run():
             logger.debug('Checking %s engine', name)
             checker = Checker(processor)
             checker.run()
-            if checker.test_results.succesfull:
+            if checker.test_results.successful:
                 result['engines'][name] = {'success': True}
             else:
                 result['engines'][name] = {'success': False, 'errors': checker.test_results.errors}
@@ -146,7 +146,7 @@ class TestResults:
         self.languages.add(language)

     @property
-    def succesfull(self):
+    def successful(self):
         return len(self.errors) == 0

     def __iter__(self):

@@ -291,7 +291,7 @@ class ResultContainerTests:
             self._record_error('No result')

     def one_title_contains(self, title: str):
-        """Check one of the title contains `title` (case insensitive comparaison)"""
+        """Check one of the title contains `title` (case insensitive comparison)"""
         title = title.lower()
         for result in self.result_container.get_ordered_results():
             if title in result['title'].lower():
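The renamed `successful` property above (and its call sites in the two checker runners) is just "no errors were recorded". A reduced sketch of that pattern:

```python
class TestResults:
    """Reduced sketch of the checker's TestResults: the renamed
    `successful` property means no errors were recorded."""

    def __init__(self):
        self.errors = []

    def record_error(self, message):
        self.errors.append(message)

    @property
    def successful(self):
        return len(self.errors) == 0


results = TestResults()
assert results.successful
results.record_error('No result')
assert not results.successful
```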
@@ -56,7 +56,7 @@ class OnlineProcessor(EngineProcessor):

     def _send_http_request(self, params):
         # create dictionary which contain all
-        # informations about the request
+        # information about the request
         request_args = dict(
             headers=params['headers'],
             cookies=params['cookies'],
@@ -19,7 +19,7 @@ search:
     default_lang : ""  # Default search language - leave blank to detect from browser information or use codes from 'languages.py'
     ban_time_on_fail : 5  # ban time in seconds after engine errors
     max_ban_time_on_fail : 120  # max ban time in seconds after engine errors
-    prefer_configured_language: False  # increase weight of results in confiugred language in ranking
+    prefer_configured_language: False  # increase weight of results in configured language in ranking

 server:
     port : 8888

@@ -73,7 +73,7 @@ ui:
 outgoing: # communication with search engines
     request_timeout : 2.0  # default timeout in seconds, can be override by engine
 #    max_request_timeout: 10.0  # the maximum timeout in seconds
-    useragent_suffix : ""  # suffix of searx_useragent, could contain informations like an email address to the administrator
+    useragent_suffix : ""  # suffix of searx_useragent, could contain information like an email address to the administrator
     pool_connections : 100  # Number of different hosts
     pool_maxsize : 10  # Number of simultaneous requests by host
 # uncomment below section if you want to use a proxy

@@ -740,6 +740,13 @@ engines:
     shortcut: iv
     timeout : 5.0
     disabled : True

+  - name: ipfs search
+    engine: ipfs_search
+    shortcut: ipfs
+    paging: True
+    timeout: 5.0
+    disabled: True
+
   - name: kickass
     engine : kickass

@@ -1284,7 +1291,7 @@ engines:
   - name : wiby
     engine : json_engine
     paging : True
-    search_url : https://wiby.me/json/?q={query}&o={pageno}0
+    search_url : https://wiby.me/json/?q={query}&p={pageno}
     url_query : URL
     title_query : Title
     content_query : Snippet

@@ -1654,7 +1661,7 @@ engines:
     paging: true
     first_page_num: 0
     search_url: https://search.brave.com/search?q={query}&offset={pageno}&spellcheck=1
-    url_xpath: //div[@class="snippet fdb"]/a/@href
+    url_xpath: //a[@class="result-header"]/@href
     title_xpath: //span[@class="snippet-title"]
     content_xpath: //p[1][@class="snippet-description"]
     suggestion_xpath: //div[@class="text-gray h6"]/a

@@ -1677,6 +1684,15 @@ engines:
     require_api_key: false
     results: HTML

+# omnom engine - see https://github.com/asciimoo/omnom for more details
+#  - name : omnom
+#    engine : omnom
+#    paging : True
+#    base_url : 'http://your.omnom.host/'
+#    enable_http : True
+#    categories : general
+#    shortcut : om
+
 # Doku engine lets you access to any Doku wiki instance:
 # A public one or a private/corporate one.
 #  - name : ubuntuwiki
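The wiby fix above replaces `&o={pageno}0` — which glued a literal `0` onto the page number, requesting page 20 when page 2 was meant — with `&p={pageno}`. How such `search_url` templates expand can be sketched with plain `str.format`:

```python
from urllib.parse import quote_plus

# the old and new wiby search_url templates from the hunk above
old_template = 'https://wiby.me/json/?q={query}&o={pageno}0'
new_template = 'https://wiby.me/json/?q={query}&p={pageno}'

q = quote_plus('old web')
# requesting page 2: the old template yielded o=20, the new one p=2
assert old_template.format(query=q, pageno=2).endswith('&o=20')
assert new_template.format(query=q, pageno=2).endswith('&p=2')
```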
@@ -37,7 +37,7 @@ def get_user_settings_path():
     # find location of settings.yml
     if 'SEARX_SETTINGS_PATH' in environ:
         # if possible set path to settings using the
-        # enviroment variable SEARX_SETTINGS_PATH
+        # environment variable SEARX_SETTINGS_PATH
         return check_settings_yml(environ['SEARX_SETTINGS_PATH'])

     # if not, get it from /etc/searx, or last resort the codebase

@@ -132,7 +132,7 @@ def load_settings(load_user_setttings=True):
         default_settings = load_yaml(default_settings_path)
         update_settings(default_settings, user_settings)
         return (default_settings,
-                'merge the default settings ( {} ) and the user setttings ( {} )'
+                'merge the default settings ( {} ) and the user settings ( {} )'
                 .format(default_settings_path, user_settings_path))

     # the user settings, fully replace the default configuration
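`load_settings` above overlays user settings on the defaults via `update_settings`. A hedged sketch of such a recursive dict merge (this is an illustrative stand-in, not searx's exact implementation):

```python
def update_settings(default_settings, user_settings):
    """Recursively overlay user settings on the defaults, in place."""
    for key, value in user_settings.items():
        if isinstance(value, dict) and isinstance(default_settings.get(key), dict):
            update_settings(default_settings[key], value)  # descend into nested sections
        else:
            default_settings[key] = value  # user value wins
    return default_settings


defaults = {'server': {'port': 8888, 'bind_address': '127.0.0.1'}}
merged = update_settings(defaults, {'server': {'port': 8080}})
assert merged['server'] == {'port': 8080, 'bind_address': '127.0.0.1'}
```

Keys the user does not set keep their default, which is why a partial user settings.yml suffices.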
@@ -96,7 +96,7 @@

     {% if 'method' not in locked_preferences %}
         {% set method_label = _('Method') %}
-        {% set method_info = _('Change how forms are submited, <a href="http://en.wikipedia.org/wiki/Hypertext_Transfer_Protocol#Request_methods" rel="external">learn more about request methods</a>') %}
+        {% set method_info = _('Change how forms are submitted, <a href="http://en.wikipedia.org/wiki/Hypertext_Transfer_Protocol#Request_methods" rel="external">learn more about request methods</a>') %}
         {{ preferences_item_header(method_info, method_label, rtl, 'method') }}
         <select class="form-control {{ custom_select_class(rtl) }}" name="method" id="method">
             <option value="POST" {% if method == 'POST' %}selected="selected"{% endif %}>POST</option>
@@ -114,6 +114,6 @@ if __name__ == '__main__':
         run_robot_tests([getattr(robot, x) for x in dir(robot) if x.startswith('test_')])
     except Exception:  # pylint: disable=broad-except
         errors = True
-        print('Error occured: {0}'.format(traceback.format_exc()))
+        print('Error occurred: {0}'.format(traceback.format_exc()))
     test_layer.tearDown()
     sys.exit(1 if errors else 0)
Binary file not shown.
@@ -10,6 +10,7 @@
 # Gabriel Nunes <gabriel.hkr@gmail.com>, 2017
 # Guimarães Mello <matheus.mello@disroot.org>, 2017
 # Neton Brício <fervelinux@gmail.com>, 2015
+# Noémi Ványi <sitbackandwait@gmail.com>, 2022
 # pizzaiolo, 2016
 # shizuka, 2018
 msgid ""

@@ -18,7 +19,7 @@ msgstr ""
 "Report-Msgid-Bugs-To: EMAIL@ADDRESS\n"
 "POT-Creation-Date: 2020-07-09 15:07+0200\n"
 "PO-Revision-Date: 2014-01-30 14:32+0000\n"
-"Last-Translator: André Marcelo Alvarenga <alvarenga@kde.org>, 2022\n"
+"Last-Translator: Noémi Ványi <sitbackandwait@gmail.com>, 2022\n"
 "Language-Team: Portuguese (Brazil) (http://www.transifex.com/asciimoo/searx/language/pt_BR/)\n"
 "MIME-Version: 1.0\n"
 "Content-Type: text/plain; charset=UTF-8\n"

@@ -81,7 +82,7 @@ msgstr "erro de busca"

 #: searx/webapp.py:634
 msgid "{minutes} minute(s) ago"
-msgstr "{minutos} minuto(s) atrás"
+msgstr "{minutes} minuto(s) atrás"

 #: searx/webapp.py:636
 msgid "{hours} hour(s), {minutes} minute(s) ago"
@@ -53,7 +53,7 @@ def parse_lang(preferences: Preferences, form: Dict[str, str], raw_text_query: R
         return preferences.get_value('language')
     # get language
     # set specific language if set on request, query or preferences
-    # TODO support search with multible languages
+    # TODO support search with multiple languages
     if len(raw_text_query.languages):
         query_lang = raw_text_query.languages[-1]
     elif 'language' in form:

@@ -216,7 +216,7 @@ def get_search_query_from_webapp(preferences: Preferences, form: Dict[str, str])
     disabled_engines = preferences.engines.get_disabled()

     # parse query, if tags are set, which change
-    # the serch engine or search-language
+    # the search engine or search-language
     raw_text_query = RawTextQuery(form['q'], disabled_engines)

     # set query

@@ -231,7 +231,7 @@ def get_search_query_from_webapp(preferences: Preferences, form: Dict[str, str])

     if not is_locked('categories') and raw_text_query.enginerefs and raw_text_query.specific:
         # if engines are calculated from query,
-        # set categories by using that informations
+        # set categories by using that information
         query_engineref_list = raw_text_query.enginerefs
     else:
         # otherwise, using defined categories to
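`parse_lang` above resolves the search language with a fixed precedence: a `:lang` modifier in the query wins over the form field, which wins over the stored preference. A simplified sketch of that resolution order (function and parameter names are illustrative):

```python
def resolve_language(preference_lang, form, query_languages):
    """Pick language: query modifier > form field > saved preference."""
    if query_languages:
        return query_languages[-1]  # the last :lang modifier in the query wins
    if 'language' in form:
        return form['language']
    return preference_lang


assert resolve_language('en', {}, []) == 'en'
assert resolve_language('en', {'language': 'de'}, []) == 'de'
assert resolve_language('en', {'language': 'de'}, ['fr', 'hu']) == 'hu'
```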
@@ -225,7 +225,7 @@ def code_highlighter(codelines, language=None):
         language = 'text'

     try:
-        # find lexer by programing language
+        # find lexer by programming language
         lexer = get_lexer_by_name(language, stripall=True)
     except:
         # if lexer is not found, using default one

@@ -647,7 +647,7 @@ def search():
             # removing html content and whitespace duplications
             result['title'] = ' '.join(html_to_text(result['title']).strip().split())

-            if 'url' in result:
+            if 'url' in result and 'pretty_url' not in result:
                 result['pretty_url'] = prettify_url(result['url'])

             # TODO, check if timezone is calculated right
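The webapp change above stops overwriting a `pretty_url` an engine already supplied: the pretty URL is only computed when missing. That guard pattern, sketched with a stand-in prettifier (the truncation logic here is illustrative, not searx's exact `prettify_url`):

```python
def prettify_url(url, max_length=74):
    """Stand-in prettifier: truncate overlong URLs in the middle."""
    if len(url) > max_length:
        chunk = (max_length - 3) // 2
        return url[:chunk] + '...' + url[-chunk:]
    return url


def add_pretty_url(result):
    # only fill pretty_url when the engine did not already provide one
    if 'url' in result and 'pretty_url' not in result:
        result['pretty_url'] = prettify_url(result['url'])
    return result


kept = add_pretty_url({'url': 'https://example.org/', 'pretty_url': 'example.org'})
assert kept['pretty_url'] == 'example.org'
```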
@@ -35,7 +35,7 @@ class UnicodeWriter:
         # Fetch UTF-8 output from the queue ...
         data = self.queue.getvalue()
         data = data.strip('\x00')
-        # ... and reencode it into the target encoding
+        # ... and re-encode it into the target encoding
         data = self.encoder.encode(data)
         # write to the target stream
         self.stream.write(data.decode())
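The `UnicodeWriter` hunk above fixes a comment in a classic queue-and-re-encode CSV writer: rows are first written to an in-memory queue, stripped of NUL bytes, re-encoded to the target encoding, then flushed to the real stream. A reduced, runnable sketch of that pattern:

```python
import codecs
import csv
import io


class UnicodeWriter:
    """Reduced sketch of the CSV writer above: rows pass through a StringIO
    queue, are stripped of NUL bytes, re-encoded, and written out."""

    def __init__(self, stream, encoding='utf-8'):
        self.queue = io.StringIO()
        self.writer = csv.writer(self.queue)
        self.stream = stream
        self.encoder = codecs.getincrementalencoder(encoding)()

    def writerow(self, row):
        self.writer.writerow(row)
        # Fetch output from the queue ...
        data = self.queue.getvalue().strip('\x00')
        # ... and re-encode it into the target encoding
        data = self.encoder.encode(data)
        # write to the target stream and reset the queue
        self.stream.write(data.decode())
        self.queue.truncate(0)
        self.queue.seek(0)


out = io.StringIO()
writer = UnicodeWriter(out)
writer.writerow(['title', 'url'])
assert out.getvalue().startswith('title,url')
```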
@@ -13,7 +13,7 @@ from searx.engines.wikidata import send_wikidata_query


 # ORDER BY (with all the query fields) is important to keep a deterministic result order
-# so multiple invokation of this script doesn't change currencies.json
+# so multiple invocation of this script doesn't change currencies.json
 SARQL_REQUEST = """
 SELECT DISTINCT ?iso4217 ?unit ?unicode ?label ?alias WHERE {
     ?item wdt:P498 ?iso4217; rdfs:label ?label.

@@ -29,7 +29,7 @@ ORDER BY ?iso4217 ?unit ?unicode ?label ?alias
 """

 # ORDER BY (with all the query fields) is important to keep a deterministic result order
-# so multiple invokation of this script doesn't change currencies.json
+# so multiple invocation of this script doesn't change currencies.json
 SPARQL_WIKIPEDIA_NAMES_REQUEST = """
 SELECT DISTINCT ?iso4217 ?article_name WHERE {
     ?item wdt:P498 ?iso4217 .
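The corrected comments above stress determinism: `ORDER BY` over all query fields keeps repeated runs from producing spurious diffs in `currencies.json`. The same concern applies on the serialization side — a sketch of the general technique (not searx's exact dump call):

```python
import json


def dump_stable(data):
    """Serialize with sorted keys so repeated runs yield identical bytes."""
    return json.dumps(data, sort_keys=True, ensure_ascii=False)


# two dicts built in different insertion orders serialize identically
a = dump_stable({'USD': '$', 'EUR': '€'})
b = dump_stable({'EUR': '€', 'USD': '$'})
assert a == b
```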
@@ -30,7 +30,7 @@ HTTP_COLON = 'http:'


 def get_bang_url():
-    response = requests.get(URL_BV1)
+    response = requests.get(URL_BV1, timeout=10.0)
     response.raise_for_status()

     r = RE_BANG_VERSION.findall(response.text)

@@ -38,7 +38,7 @@ def get_bang_url():


 def fetch_ddg_bangs(url):
-    response = requests.get(url)
+    response = requests.get(url, timeout=10.0)
     response.raise_for_status()
     return json.loads(response.content.decode())
@@ -5,7 +5,7 @@ import requests
 import re
 from os.path import dirname, join
 from urllib.parse import urlparse, urljoin
-from distutils.version import LooseVersion, StrictVersion
+from packaging.version import Version, parse
 from lxml import html
 from searx import searx_dir

@@ -39,7 +39,7 @@ def fetch_firefox_versions():
         if path.startswith(RELEASE_PATH):
             version = path[len(RELEASE_PATH):-1]
             if NORMAL_REGEX.match(version):
-                versions.append(LooseVersion(version))
+                versions.append(Version(version))

     list.sort(versions, reverse=True)
     return versions

@@ -49,12 +49,12 @@ def fetch_firefox_last_versions():
     versions = fetch_firefox_versions()

     result = []
-    major_last = versions[0].version[0]
+    major_last = versions[0].major
     major_list = (major_last, major_last - 1)
     for version in versions:
-        major_current = version.version[0]
+        major_current = version.major
         if major_current in major_list:
-            result.append(version.vstring)
+            result.append(str(version))

     return result
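The hunks above migrate from the deprecated `distutils.version.LooseVersion` to `packaging.version.Version`, whose `.major` attribute replaces `version.version[0]` and `str(version)` replaces `.vstring`. Since `packaging` is a third-party dependency, here is a stdlib stand-in showing the same "keep the last two major versions" selection logic on plain `X.Y.Z` strings:

```python
def parse_version(text):
    """Tiny stand-in for packaging.version.Version on plain X.Y.Z strings."""
    return tuple(int(part) for part in text.split('.'))


def last_two_major_versions(version_strings):
    versions = sorted((parse_version(v) for v in version_strings), reverse=True)
    major_last = versions[0][0]                  # `.major` in packaging terms
    major_list = (major_last, major_last - 1)
    return ['.'.join(str(n) for n in v) for v in versions if v[0] in major_list]


releases = ['91.0', '91.0.1', '90.0.2', '89.0', '92.0']
assert last_two_major_versions(releases) == ['92.0', '91.0.1', '91.0']
```

Unlike `LooseVersion`, both `packaging.version.Version` and this tuple comparison order `91.0.1` above `91.0` numerically rather than lexically.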
@@ -18,7 +18,7 @@ engines_languages_file = Path(searx_dir) / 'data' / 'engines_languages.json'
 languages_file = Path(searx_dir) / 'languages.py'


-# Fetchs supported languages for each engine and writes json file with those.
+# Fetches supported languages for each engine and writes json file with those.
 def fetch_supported_languages():

     engines_languages = dict()
20 utils/lib.sh

@@ -451,17 +451,17 @@ install_template() {
     fi

     if [[ -f "${dst}" ]] && cmp --silent "${template_file}" "${dst}" ; then
-        info_msg "file ${dst} allready installed"
+        info_msg "file ${dst} already installed"
         return 0
     fi

-    info_msg "diffrent file ${dst} allready exists on this host"
+    info_msg "different file ${dst} already exists on this host"

     while true; do
         choose_one _reply "choose next step with file $dst" \
                    "replace file" \
                    "leave file unchanged" \
-                   "interactiv shell" \
+                   "interactive shell" \
                    "diff files"

         case $_reply in

@@ -474,7 +474,7 @@ install_template() {
             "leave file unchanged")
                 break
                 ;;
-            "interactiv shell")
+            "interactive shell")
                 echo -e "// edit ${_Red}${dst}${_creset} to your needs"
                 echo -e "// exit with [${_BCyan}CTRL-D${_creset}]"
                 sudo -H -u "${owner}" -i

@@ -1018,8 +1018,8 @@ nginx_install_app() {

 nginx_include_apps_enabled() {

-    # Add the *NGINX_APPS_ENABLED* infrastruture to a nginx server block.  Such
-    # infrastruture is already known from fedora and centos, including apps (location
+    # Add the *NGINX_APPS_ENABLED* infrastructure to a nginx server block.  Such
+    # infrastructure is already known from fedora and centos, including apps (location
     # directives) from the /etc/nginx/default.d folder into the *default* nginx
     # server.

@@ -1521,7 +1521,7 @@ _apt_pkg_info_is_updated=0

 pkg_install() {

-    # usage: TITEL='install foobar' pkg_install foopkg barpkg
+    # usage: TITLE='install foobar' pkg_install foopkg barpkg

     rst_title "${TITLE:-installation of packages}" section
     echo -e "\npackage(s)::\n"

@@ -1557,7 +1557,7 @@ pkg_install() {

 pkg_remove() {

-    # usage: TITEL='remove foobar' pkg_remove foopkg barpkg
+    # usage: TITLE='remove foobar' pkg_remove foopkg barpkg

     rst_title "${TITLE:-remove packages}" section
     echo -e "\npackage(s)::\n"

@@ -1623,7 +1623,7 @@ git_clone() {
     # git_clone <url> <path> [<branch> [<user>]]
     #
     # First form uses $CACHE/<name> as destination folder, second form clones
-    # into <path>.  If repository is allready cloned, pull from <branch> and
+    # into <path>.  If repository is already cloned, pull from <branch> and
     # update working tree (if needed, the caller has to stash local changes).
     #
     # git clone https://github.com/searx/searx searx-src origin/master searxlogin

@@ -1696,7 +1696,7 @@ lxc_init_container_env() {
     # usage: lxc_init_container_env <name>

     # Create a /.lxcenv file in the root folder.  Call this once after the
-    # container is inital started and before installing any boilerplate stuff.
+    # container is initial started and before installing any boilerplate stuff.

     info_msg "create /.lxcenv in container $1"
     cat <<EOF | lxc exec "${1}" -- bash | prefix_stdout "[${_BBlue}${1}${_creset}] "
@@ -107,7 +107,7 @@ show
   :suite:   show services of all (or <name>) containers from the LXC suite
   :images:  show information of local images
 cmd
-  use single qoutes to evaluate in container's bash, e.g.: 'echo \$(hostname)'
+  use single quotes to evaluate in container's bash, e.g.: 'echo \$(hostname)'
   --        run command '...' in all containers of the LXC suite
   :<name>:  run command '...' in container <name>
 install

@@ -178,7 +178,7 @@ main() {
                 lxc_delete_container "$2"
             fi
             ;;
-        *) usage "uknown or missing container <name> $2"; exit 42;;
+        *) usage "unknown or missing container <name> $2"; exit 42;;
         esac
         ;;
     start|stop)

@@ -190,7 +190,7 @@ main() {
             info_msg "lxc $1 $2"
             lxc "$1" "$2" | prefix_stdout "[${_BBlue}${i}${_creset}] "
             ;;
-        *) usage "uknown or missing container <name> $2"; exit 42;;
+        *) usage "unknown or missing container <name> $2"; exit 42;;
         esac
         ;;
     show)
@@ -402,7 +402,7 @@ EOF
 }

 enable_debug() {
-    warn_msg "Do not enable debug in production enviroments!!"
+    warn_msg "Do not enable debug in production environments!!"
     info_msg "Enabling debug option needs to reinstall systemd service!"
     set_service_env_debug true
 }

@@ -436,7 +436,7 @@ install_settings() {
     choose_one action "What should happen to the settings file? " \
         "keep configuration unchanged" \
         "use origin settings" \
-        "start interactiv shell"
+        "start interactive shell"
     case $action in
         "keep configuration unchanged")
             info_msg "leave settings file unchanged"

@@ -446,7 +446,7 @@ install_settings() {
             info_msg "install origin settings"
             cp "${SEARX_SETTINGS_TEMPLATE}" "${SEARX_SETTINGS_PATH}"
             ;;
-        "start interactiv shell")
+        "start interactive shell")
             backup_file "${SEARX_SETTINGS_PATH}"
             echo -e "// exit with [${_BCyan}CTRL-D${_creset}]"
             sudo -H -i

@@ -533,7 +533,7 @@ EOF
 }

 test_local_searx() {
-    rst_title "Testing searx instance localy" section
+    rst_title "Testing searx instance locally" section
     echo

     if service_is_available "http://${SEARX_INTERNAL_HTTP}" &>/dev/null; then

@@ -600,7 +600,7 @@ EOF
 }

 enable_debug() {
-    warn_msg "Do not enable debug in production enviroments!!"
+    warn_msg "Do not enable debug in production environments!!"
     info_msg "try to enable debug mode ..."
     tee_stderr 0.1 <<EOF | sudo -H -i 2>&1 | prefix_stdout "$_service_prefix"
 cd ${SEARX_SRC}

@@ -833,8 +833,8 @@ rst-doc() {

     eval "echo \"$(< "${REPO_ROOT}/docs/build-templates/searx.rst")\""

-    # I use ubuntu-20.04 here to demonstrate that versions are also suported,
-    # normaly debian-* and ubuntu-* are most the same.
+    # I use ubuntu-20.04 here to demonstrate that versions are also supported,
+    # normally debian-* and ubuntu-* are most the same.

     for DIST_NAME in ubuntu-20.04 arch fedora; do
     (
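The `enable_debug` change above relies on a shell idiom these scripts use often: a heredoc of commands is both mirrored for the operator (`tee_stderr`) and fed into a root login shell, whose combined output is tagged (`prefix_stdout`). A self-contained sketch with simplified stand-ins for those helpers — the real ones also pace the echoed lines and add colors:

```shell
#!/usr/bin/env bash
# Simplified stand-ins (assumptions) for the repo's tee_stderr / prefix_stdout
# helpers: mirror each input line to stderr, and tag every output line.
tee_stderr() { tee /dev/stderr; }
prefix_stdout() { sed "s/^/$1/"; }

# The idiom from the diff: commands come from a heredoc, run in a shell
# (plain `bash` here instead of `sudo -H -i`), stderr is merged into stdout,
# and the combined output is prefixed for readability.
tee_stderr <<EOF | bash 2>&1 | prefix_stdout "[searx] "
echo debug mode enabled
EOF
```

The operator sees both the command being typed (on stderr) and its tagged output, which is what makes these long provisioning runs auditable.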
@@ -27,7 +27,7 @@ disable-logging = true
 # The right granted on the created socket
 chmod-socket = 666

-# Plugin to use and interpretor config
+# Plugin to use and interpreter config
 single-interpreter = true

 # enable master process

@@ -27,7 +27,7 @@ disable-logging = true
 # The right granted on the created socket
 chmod-socket = 666

-# Plugin to use and interpretor config
+# Plugin to use and interpreter config
 single-interpreter = true

 # enable master process

@@ -26,7 +26,7 @@ disable-logging = true
 # The right granted on the created socket
 chmod-socket = 666

-# Plugin to use and interpretor config
+# Plugin to use and interpreter config
 single-interpreter = true

 # enable master process

@@ -26,7 +26,7 @@ disable-logging = true
 # The right granted on the created socket
 chmod-socket = 666

-# Plugin to use and interpretor config
+# Plugin to use and interpreter config
 single-interpreter = true

 # enable master process
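These four hunks patch the same comment line in the per-distro uwsgi INI templates; in context the corrected section reads roughly as follows. The keys around the comment come from the hunks themselves; `master = true` is the conventional value under the final comment and is an assumption here:

```ini
# The right granted on the created socket
chmod-socket = 666

# Plugin to use and interpreter config
single-interpreter = true

# enable master process
master = true
```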