mirror of
https://github.com/searx/searx
synced 2024-11-22 01:45:21 +01:00
Fix typos (#3366)
Found via `codespell -S ./searx/translations,./searx/data,./searx/static -L ans,te,fo,doubleclick,tthe,dum`
This commit is contained in:
parent 57e7e3bbf6
commit 629ebb426f
@@ -424,7 +424,7 @@ Special thanks to `NLNet <https://nlnet.nl>`__ for sponsoring multiple features
 - Removed engines: faroo

 Special thanks to `NLNet <https://nlnet.nl>`__ for sponsoring multiple features of this release.
-Special thanks to https://www.accessibility.nl/english for making accessibilty audit.
+Special thanks to https://www.accessibility.nl/english for making accessibility audit.

 News
 ~~~~
@@ -94,7 +94,7 @@ the new proposed features privacy respecting enough. The most significant issue
 engine metrics.

 Searx is built for privacy conscious users. It comes a unique set of
-challanges. One of the problems we face is that users rather not report bugs,
+challenges. One of the problems we face is that users rather not report bugs,
 because they do not want to publicly share what engines they use or what search
 query triggered a problem. It is a challenge we accepted.

@@ -124,8 +124,8 @@ instances locally, instead of using public instances.
 Why should I use SearxNG?
 #########################

-SearxNG has rolling releases, depencencies updated more frequently, and engines are fixed
+SearxNG has rolling releases, dependencies updated more frequently, and engines are fixed
 faster. It is easy to set up your own public instance, and monitor its
-perfomance and metrics. It is simple to maintain as an instance adminstrator.
+performance and metrics. It is simple to maintain as an instance administrator.

 As a user, it provides a prettier user interface and nicer experience.
@@ -100,7 +100,7 @@ update_conf() {
 # There is a new version
 if [ $FORCE_CONF_UPDATE -ne 0 ]; then
 # Replace the current configuration
-printf '⚠️ Automaticaly update %s to the new version\n' "${CONF}"
+printf '⚠️ Automatically update %s to the new version\n' "${CONF}"
 if [ ! -f "${OLD_CONF}" ]; then
 printf 'The previous configuration is saved to %s\n' "${OLD_CONF}"
 mv "${CONF}" "${OLD_CONF}"
@@ -9,7 +9,7 @@ workers = 4
 # The right granted on the created socket
 chmod-socket = 666

-# Plugin to use and interpretor config
+# Plugin to use and interpreter config
 single-interpreter = true
 master = true
 plugin = python3
@@ -37,7 +37,7 @@ Disabled **D** Engine type **ET**
 ------------- ----------- -------------------- ------------
 Safe search **SS**
 ------------- ----------- ---------------------------------
-Weigth **W**
+Weight **W**
 ------------- ----------- ---------------------------------
 Disabled **D**
 ------------- ----------- ---------------------------------
@@ -39,7 +39,7 @@ Example
 Scenario:

 #. Recoll indexes a local filesystem mounted in ``/export/documents/reference``,
-#. the Recoll search inteface can be reached at https://recoll.example.org/ and
+#. the Recoll search interface can be reached at https://recoll.example.org/ and
 #. the contents of this filesystem can be reached though https://download.example.org/reference

 .. code:: yaml
@@ -94,7 +94,7 @@ My experience is, that this command is a bit buggy.

 .. _uwsgi configuration:

-Alltogether
+All together
 ===========

 Create the configuration ini-file according to your distribution (see below) and
@@ -129,7 +129,7 @@ Global Settings
 outgoing: # communication with search engines
 request_timeout : 2.0 # default timeout in seconds, can be override by engine
 # max_request_timeout: 10.0 # the maximum timeout in seconds
-useragent_suffix : "" # informations like an email address to the administrator
+useragent_suffix : "" # information like an email address to the administrator
 pool_connections : 100 # Number of different hosts
 pool_maxsize : 10 # Number of simultaneous requests by host
 # uncomment below section if you want to use a proxy
@@ -207,7 +207,7 @@ debug services from filtron and morty analogous use:
 Another point we have to notice is that each service (:ref:`searx <searx.sh>`,
 :ref:`filtron <filtron.sh>` and :ref:`morty <morty.sh>`) runs under dedicated
 system user account with the same name (compare :ref:`create searx user`). To
-get a shell from theses accounts, simply call one of the scripts:
+get a shell from these accounts, simply call one of the scripts:

 .. tabs::

@@ -311,7 +311,7 @@ of the container:

 Now we can develop as usual in the working tree of our desktop system. Every
 time the software was changed, you have to restart the searx service (in the
-conatiner):
+container):

 .. tabs::

@@ -370,7 +370,7 @@ We build up a fully functional searx suite in a archlinux container:
 $ sudo -H ./utils/lxc.sh install suite searx-archlinux

 To access HTTP from the desktop we installed nginx for the services inside the
-conatiner:
+container:

 .. tabs::

@@ -16,7 +16,7 @@ you can use your owm template by placing the template under
 ``searx/templates/{theme_name}/result_templates/{template_name}`` and setting
 ``result_template`` attribute to ``{template_name}``.

-Futhermore, if you do not want to expose these engines on a public instance, you can
+Furthermore, if you do not want to expose these engines on a public instance, you can
 still add them and limit the access by setting ``tokens`` as described in the `blog post about
 private engines`_.

@@ -29,7 +29,7 @@ structure.
 Redis
 -----

-Reqired package: ``redis``
+Required package: ``redis``

 Redis is a key value based data store usually stored in memory.

@@ -15,7 +15,7 @@ All of the engines above are added to ``settings.yml`` just commented out, as yo
 Please note that if you are not using HTTPS to access these engines, you have to enable
 HTTP requests by setting ``enable_http`` to ``True``.

-Futhermore, if you do not want to expose these engines on a public instance, you can
+Furthermore, if you do not want to expose these engines on a public instance, you can
 still add them and limit the access by setting ``tokens`` as described in the `blog post about
 private engines`_.

@@ -57,7 +57,7 @@ small-scale (less than 10 million documents) data collections. E.g. it is great
 web pages you have visited and searching in the contents later.

 The engine supports faceted search, so you can search in a subset of documents of the collection.
-Futhermore, you can search in Meilisearch instances that require authentication by setting ``auth_token``.
+Furthermore, you can search in Meilisearch instances that require authentication by setting ``auth_token``.

 Here is a simple example to query a Meilisearch instance:

@@ -62,7 +62,7 @@ Before enabling MySQL engine, you must install the package ``mysql-connector-pyt
 The authentication plugin is configurable by setting ``auth_plugin`` in the attributes.
 By default it is set to ``caching_sha2_password``.

-This is an example configuration for quering a MySQL server:
+This is an example configuration for querying a MySQL server:

 .. code:: yaml

@@ -41,7 +41,7 @@ engine file
 argument type information
 ======================= =========== ========================================================
 categories list pages, in which the engine is working
-paging boolean support multible pages
+paging boolean support multiple pages
 time_range_support boolean support search time range
 engine_type str ``online`` by default, other possibles values are
 ``offline``, ``online_dictionary``, ``online_currency``
@@ -159,7 +159,7 @@ parsed arguments
 ----------------

 The function ``def request(query, params):`` always returns the ``params``
-variable. Inside searx, the following paramters can be used to specify a search
+variable. Inside searx, the following parameters can be used to specify a search
 request:

 =================== =========== ==========================================================================

@@ -15,7 +15,7 @@ generated and deployed at :docs:`github.io <.>`. For build prerequisites read
 :ref:`docs build`.

 The source files of Searx's documentation are located at :origin:`docs`. Sphinx
-assumes source files to be encoded in UTF-8 by defaul. Run :ref:`make docs.live
+assumes source files to be encoded in UTF-8 by default. Run :ref:`make docs.live
 <make docs.live>` to build HTML while editing.

 .. sidebar:: Further reading
@@ -227,13 +227,13 @@ To refer anchors use the `ref role`_ markup:

 .. code:: reST

-Visit chapter :ref:`reST anchor`. Or set hyperlink text manualy :ref:`foo
+Visit chapter :ref:`reST anchor`. Or set hyperlink text manually :ref:`foo
 bar <reST anchor>`.

 .. admonition:: ``:ref:`` role
 :class: rst-example

-Visist chapter :ref:`reST anchor`. Or set hyperlink text manualy :ref:`foo
+Visist chapter :ref:`reST anchor`. Or set hyperlink text manually :ref:`foo
 bar <reST anchor>`.

 .. _reST ordinary ref:
@@ -494,8 +494,8 @@ Figures & Images
 is flexible. To get best results in the generated output format, install
 ImageMagick_ and Graphviz_.

-Searx's sphinx setup includes: :ref:`linuxdoc:kfigure`. Scaleable here means;
+Searx's sphinx setup includes: :ref:`linuxdoc:kfigure`. Scalable here means;
-scaleable in sense of the build process. Normally in absence of a converter
+scalable in sense of the build process. Normally in absence of a converter
 tool, the build process will break. From the authors POV it’s annoying to care
 about the build process when handling with images, especially since he has no
 access to the build process. With :ref:`linuxdoc:kfigure` the build process
@@ -503,7 +503,7 @@ continues and scales output quality in dependence of installed image processors.

 If you want to add an image, you should use the ``kernel-figure`` (inheritance
 of :dudir:`figure`) and ``kernel-image`` (inheritance of :dudir:`image`)
-directives. E.g. to insert a figure with a scaleable image format use SVG
+directives. E.g. to insert a figure with a scalable image format use SVG
 (:ref:`svg image example`):

 .. code:: reST
@@ -1185,7 +1185,7 @@ and *targets* (e.g. a ref to :ref:`row 2 of table's body <row body 2>`).
 - cell 4.4

 * - row 5
-- cell 5.1 with automatic span to rigth end
+- cell 5.1 with automatic span to right end

 * - row 6
 - cell 6.1
@@ -1237,7 +1237,7 @@ and *targets* (e.g. a ref to :ref:`row 2 of table's body <row body 2>`).
 - cell 4.4

 * - row 5
-- cell 5.1 with automatic span to rigth end
+- cell 5.1 with automatic span to right end

 * - row 6
 - cell 6.1

@@ -17,7 +17,7 @@ Prefix: ``:``
 Prefix: ``?``
 to add engines and categories to the currently selected categories

-Abbrevations of the engines and languages are also accepted. Engine/category
+Abbreviations of the engines and languages are also accepted. Engine/category
 modifiers are chainable and inclusive (e.g. with :search:`!it !ddg !wp qwer
 <?q=%21it%20%21ddg%20%21wp%20qwer>` search in IT category **and** duckduckgo
 **and** wikipedia for ``qwer``).

manage
@@ -192,7 +192,7 @@ docker.build() {
 # awk to remove the "v" and the "g"
 SEARX_GIT_VERSION=$(git describe --tags | awk -F'-' '{OFS="-"; $1=substr($1, 2); if ($3) { $3=substr($3, 2); } print}')

-# add the suffix "-dirty" if the repository has uncommited change
+# add the suffix "-dirty" if the repository has uncommitted change
 # /!\ HACK for searx/searx: ignore utils/brand.env
 git update-index -q --refresh
 if [ ! -z "$(git diff-index --name-only HEAD -- | grep -v 'utils/brand.env')" ]; then

@@ -41,11 +41,11 @@ if settings['ui']['static_path']:

 '''
 enable debug if
-the environnement variable SEARX_DEBUG is 1 or true
+the environment variable SEARX_DEBUG is 1 or true
 (whatever the value in settings.yml)
 or general.debug=True in settings.yml
 disable debug if
-the environnement variable SEARX_DEBUG is 0 or false
+the environment variable SEARX_DEBUG is 0 or false
 (whatever the value in settings.yml)
 or general.debug=False in settings.yml
 '''

@@ -142,7 +142,7 @@ def load_engine(engine_data):

 engine.stats = {
 'sent_search_count': 0, # sent search
-'search_count': 0, # succesful search
+'search_count': 0, # successful search
 'result_count': 0,
 'engine_time': 0,
 'engine_time_count': 0,
@@ -171,7 +171,7 @@ def load_engine(engine_data):
 categories.setdefault(category_name, []).append(engine)

 if engine.shortcut in engine_shortcuts:
-logger.error('Engine config error: ambigious shortcut: {0}'.format(engine.shortcut))
+logger.error('Engine config error: ambiguous shortcut: {0}'.format(engine.shortcut))
 sys.exit(1)

 engine_shortcuts[engine.shortcut] = engine.name

@@ -70,7 +70,7 @@ def request(query, params):
 if params['time_range'] in time_range_dict:
 params['url'] += time_range_string.format(interval=time_range_dict[params['time_range']])

-# bing videos did not like "older" versions < 70.0.1 when selectin other
+# bing videos did not like "older" versions < 70.0.1 when selecting other
 # languages then 'en' .. very strange ?!?!
 params['headers']['User-Agent'] = 'Mozilla/5.0 (X11; Linux x86_64; rv:73.0.1) Gecko/20100101 Firefox/73.0.1'

@@ -80,7 +80,7 @@ def response(resp):
 # * book / performing art / film / television / media franchise / concert tour / playwright
 # * prepared food
 # * website / software / os / programming language / file format / software engineer
-# * compagny
+# * company

 content = ''
 heading = search_res.get('Heading', '')

@@ -40,7 +40,7 @@ def response(resp):

 search_res = loads(resp.text)

-# check if items are recieved
+# check if items are received
 if 'items' not in search_res:
 return []

@@ -282,7 +282,7 @@ def response(resp):

 # google *sections*
 if extract_text(eval_xpath(result, g_section_with_header)):
-logger.debug("ingoring <g-section-with-header>")
+logger.debug("ignoring <g-section-with-header>")
 continue

 try:

@@ -2,7 +2,7 @@
 """Google (News)

 For detailed description of the *REST-full* API see: `Query Parameter
-Definitions`_. Not all parameters can be appied:
+Definitions`_. Not all parameters can be applied:

 - num_ : the number of search results is ignored
 - save_ : is ignored / Google-News results are always *SafeSearch*
@@ -155,7 +155,7 @@ def response(resp):
 padding = (4 -(len(jslog) % 4)) * "="
 jslog = b64decode(jslog + padding)
 except binascii.Error:
-# URL cant be read, skip this result
+# URL can't be read, skip this result
 continue

 # now we have : b'[null, ... null,"https://www.cnn.com/.../index.html"]'

@@ -2,7 +2,7 @@
 """Google (Video)

 For detailed description of the *REST-full* API see: `Query Parameter
-Definitions`_. Not all parameters can be appied.
+Definitions`_. Not all parameters can be applied.

 .. _admonition:: Content-Security-Policy (CSP)

@@ -163,7 +163,7 @@ def response(resp):

 # google *sections*
 if extract_text(eval_xpath(result, g_section_with_header)):
-logger.debug("ingoring <g-section-with-header>")
+logger.debug("ignoring <g-section-with-header>")
 continue

 title = extract_text(eval_xpath_getindex(result, title_xpath, 0))

@@ -72,7 +72,7 @@ def response(resp):
 elif properties.get('osm_type') == 'R':
 osm_type = 'relation'
 else:
-# continue if invalide osm-type
+# continue if invalid osm-type
 continue

 url = result_base_url.format(osm_type=osm_type,

@@ -71,7 +71,7 @@ def response(resp):
 if 'downloadUrl' in result:
 new_result['torrentfile'] = result['downloadUrl']

-# magnet link *may* be in guid, but it may be also idential to infoUrl
+# magnet link *may* be in guid, but it may be also identical to infoUrl
 if 'guid' in result and isinstance(result['guid'], str) and result['guid'].startswith('magnet'):
 new_result['magnetlink'] = result['guid']

@@ -51,7 +51,7 @@ search_url = base_url + 'sp/search?'

 # specific xpath variables
 # ads xpath //div[@id="results"]/div[@id="sponsored"]//div[@class="result"]
-# not ads: div[@class="result"] are the direct childs of div[@id="results"]
+# not ads: div[@class="result"] are the direct children of div[@id="results"]
 results_xpath = '//div[@class="w-gl__result__main"]'
 link_xpath = './/a[@class="w-gl__result-title result-link"]'
 content_xpath = './/p[@class="w-gl__description"]'
@@ -216,7 +216,7 @@ def _fetch_supported_languages(resp):
 # native name, the English name of the writing script used by the language,
 # or occasionally something else entirely.

-# this cases are so special they need to be hardcoded, a couple of them are mispellings
+# this cases are so special they need to be hardcoded, a couple of them are misspellings
 language_names = {
 'english_uk': 'en-GB',
 'fantizhengwen': ['zh-TW', 'zh-HK'],

@@ -49,7 +49,7 @@ WIKIDATA_PROPERTIES = {
 # SERVICE wikibase:label: https://en.wikibooks.org/wiki/SPARQL/SERVICE_-_Label#Manual_Label_SERVICE
 # https://en.wikibooks.org/wiki/SPARQL/WIKIDATA_Precision,_Units_and_Coordinates
 # https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#Data_model
-# optmization:
+# optimization:
 # * https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/query_optimization
 # * https://github.com/blazegraph/database/wiki/QueryHints
 QUERY_TEMPLATE = """
@@ -335,7 +335,7 @@ def get_attributes(language):
 add_amount('P2046') # area
 add_amount('P281') # postal code
 add_label('P38') # currency
-add_amount('P2048') # heigth (building)
+add_amount('P2048') # height (building)

 # Media
 for p in ['P400', # platform (videogames, computing)

@@ -50,7 +50,7 @@ def request(query, params):

 # replace private user area characters to make text legible
 def replace_pua_chars(text):
-pua_chars = {'\uf522': '\u2192', # rigth arrow
+pua_chars = {'\uf522': '\u2192', # right arrow
 '\uf7b1': '\u2115', # set of natural numbers
 '\uf7b4': '\u211a', # set of rational numbers
 '\uf7b5': '\u211d', # set of real numbers

@@ -35,7 +35,7 @@ time_range_support = False

 time_range_url = '&hours={time_range_val}'
 '''Time range URL parameter in the in :py:obj:`search_url`. If no time range is
-requested by the user, the URL paramter is an empty string. The
+requested by the user, the URL parameter is an empty string. The
 ``{time_range_val}`` replacement is taken from the :py:obj:`time_range_map`.

 .. code:: yaml

@@ -30,7 +30,7 @@ def get_external_url(url_id, item_id, alternative="default"):
 """Return an external URL or None if url_id is not found.

 url_id can take value from data/external_urls.json
-The "imdb_id" value is automaticaly converted according to the item_id value.
+The "imdb_id" value is automatically converted according to the item_id value.

 If item_id is None, the raw URL with the $1 is returned.
 """

@@ -78,7 +78,7 @@ def load_single_https_ruleset(rules_path):
 rules = []
 exclusions = []

-# parse childs from ruleset
+# parse children from ruleset
 for ruleset in root:
 # this child define a target
 if ruleset.tag == 'target':

@@ -2435,7 +2435,7 @@
 <rule from="^http://widgets\.yahoo\.com/[^?]*"
 to="https://www.yahoo.com/" />

<rule from="^http://((?:\w\w|fr-ca\.actualites|address|\w\w\.address|admanager|(?:\w\w|global)\.adserver|adspecs|\w+\.adspecs|\w+\.adspecs-new|advertising|\w\w\.advertising|beap\.adx|c5a?\.ah|(?:s-)?cookex\.amp|(?:[aosz]|apac|y3?)\.analytics|anc|answers|(?:\w\w|espanol|malaysia)\.answers|antispam|\w\w\.antispam|vn\.antoan|au\.apps|global\.ard|astrology|\w\w\.astrology|hk\.(?:(?:info|f1\.master|f1\.page|search|store|edit\.store|user)\.)?auctions|autos|\w\w\.autos|ar\.ayuda|(?:clicks\.beap|csc\.beap|pn1|row|us)\.bc|tw\.bid|tw\.(?:campaign|master|mb|page|search|store|user)\.bid|(?:m\.)?tw\.bigdeals|tw\.billing|biz|boss|(?:tw\.partner|tw)\.buy|(?:\w\w\.)?calendar|careers|\w\w\.cars|(?:\w\w|es-us)\.celebridades|(?:\w\w\.)?celebrity|tw\.charity|i?chart|(?:\w\w|es-us)\.cine|\w\w\.cinema|(?:\w\w|es-us)\.clima|migration\.cn|(?:deveopers\.)?commercecentral|br\.contribuidores|(?:uk\.)?contributor|au\.dating|(?:\w\w|es-us)\.deportes|developer|tw\.dictionary|dir|downloads|s-b\.dp|(?:eu\.|na\.|sa\.|tw\.)?edit|tw\.(?:ysm\.)?emarketing|en-maktoob|\w\w\.entertainment|espanol|edit\.europe|eurosport|(?:de|es|it|uk)\.eurosport|everything|\w\w\.everything|\w+\.fantasysports|au\.fango|tw\.fashion|br\.financas|finance|(?:\w\w|tw\.chart|espanol|tw\.futures|streamerapi)\.finance|(?:\w\w|es-us)\.finanzas|nz\.rss\.food|nz\.forums|games|(?:au|ca|uk)\.games|geo|gma|groups|(?:\w\w|asia|espanol|es-us|fr-ca|moderators)\.groups|health|help|(?:\w\w|secure)\.help|homes|(?:tw|tw\.v2)\.house|info|\w\w\.info|tw\.tool\.ks|au\.launch|legalredirect|(?:\w\w)\.lifestyle|(?:gh\.bouncer\.)?login|us\.l?rd|local|\w\w\.local|m|r\.m|\w\w\.m|mail|(?:\w\w\.overview|[\w-]+(?:\.c\.yom)?)\.mail|maktoob|malaysia|tw\.(?:user\.)?mall|maps|(?:\w\w|espanol|sgws2)\.maps|messenger|(?:\w\w|malaysia)\.messenger|\w\w\.meteo|mlogin|mobile|(?:\w\w|espanol|malaysia)\.mobile|tw\.(?:campaign\.)?money|tw\.movie|movies|(?:au|ca|nz|au\.rss|nz\.rss|tw|uk)\.movies|[\w.-]+\.msg|(?:\w\w|es-us)\.mujer|music|ca\.music|[\w-]+\.musica|my|us\.my
|de\.nachrichten|ucs\.netsvs|news|(?:au|ca|fr|gr|hk|in|nz|ph|nz\.rss|sg|tw|uk)\.news|cookiex\.ngd|(?:\w\w|es-us)\.noticias|omg|(?:\w\w|es-us)\.omg|au\.oztips|rtb\.pclick|pilotx1|pipes|play|playerio|privacy|profile|tw\.promo|(?:au|hk|nz)\.promotions|publishing|(?:analytics|mailapps|media|ucs|us-locdrop|video)\.query|hk\.rd|(?:\w\w\.|fr-ca\.)?safely|screen|(?:\w\w|es-us)\.screen|scribe|search|(?:\w\w|w\w\.blog|\w\w\.dictionary|finance|\w\w\.finance|images|\w\w\.images|\w\w\.knowledge|\w\w\.lifestyle|\w\w\.local|malaysia|movies|\w\w\.movies|news|\w\w\.news|malaysia\.news|r|recipes|\w\w\.recipes|shine|shopping|\w\w\.shopping|sports|\w\w\.sports|tools|au\.tv|video|\w\w\.video|malaysia\.video)\.search|sec|rtb\.pclick\.secure|security|tw\.security|\w\w\.seguranca|\w\w\.seguridad|es-us\.seguridad|\w\w\.seguro|tw\.serviceplus|settings|shine|ca\.shine|shopping|ca\.shopping|\w+\.sitios|dashboard\.slingstone|(?:au\.|order\.)?smallbusiness|smarttv|rd\.software|de\.spiele|sports|(?:au|ca|fr|hk|nz|ph|profiles|au\.rss|nz\.rss|tw)\.sports|tw\.stock|au\.thehype|\w\w\.tiempo|es\.todo|toolbar|(?:\w\w|data|malaysia)\.toolbar|(?:au|nz)\.totaltravel|transparency|travel|tw\.travel||tv|(?:ar|au|de|fr|es|es-us|it|mx|nz|au\.rss|uk)\.tv|tw\.uwant|(?:mh|nz|qos|yep)\.video|weather|(?:au|ca|hk|in|nz|sg|ph|uk|us)\.weather|de\.wetter|www|au\.yel|video\.media\.yql|dmros\.ysm)\.)?yahoo\.com/"
|
<rule from="^http://((?:\w\w|fr-ca\.actualites|address|\w\w\.address|admanager|(?:\w\w|global)\.adserver|adspecs|\w+\.adspecs|\w+\.adspecs-new|advertising|\w\w\.advertising|beap\.adx|c5a?\.ah|(?:s-)?cookex\.amp|(?:[aosz]|apac|y3?)\.analytics|anc|answers|(?:\w\w|espanol|malaysia)\.answers|antispam|\w\w\.antispam|vn\.antoan|au\.apps|global\.ard|astrology|\w\w\.astrology|hk\.(?:(?:info|f1\.master|f1\.page|search|store|edit\.store|user)\.)?auctions|autos|\w\w\.autos|ar\.ayuda|(?:clicks\.beap|csc\.beap|pn1|row|us)\.bc|tw\.bid|tw\.(?:campaign|master|mb|page|search|store|user)\.bid|(?:m\.)?tw\.bigdeals|tw\.billing|biz|boss|(?:tw\.partner|tw)\.buy|(?:\w\w\.)?calendar|careers|\w\w\.cars|(?:\w\w|es-us)\.celebridades|(?:\w\w\.)?celebrity|tw\.charity|i?chart|(?:\w\w|es-us)\.cine|\w\w\.cinema|(?:\w\w|es-us)\.clima|migration\.cn|(?:developers\.)?commercecentral|br\.contribuidores|(?:uk\.)?contributor|au\.dating|(?:\w\w|es-us)\.deportes|developer|tw\.dictionary|dir|downloads|s-b\.dp|(?:eu\.|na\.|sa\.|tw\.)?edit|tw\.(?:ysm\.)?emarketing|en-maktoob|\w\w\.entertainment|espanol|edit\.europe|eurosport|(?:de|es|it|uk)\.eurosport|everything|\w\w\.everything|\w+\.fantasysports|au\.fango|tw\.fashion|br\.financas|finance|(?:\w\w|tw\.chart|espanol|tw\.futures|streamerapi)\.finance|(?:\w\w|es-us)\.finanzas|nz\.rss\.food|nz\.forums|games|(?:au|ca|uk)\.games|geo|gma|groups|(?:\w\w|asia|espanol|es-us|fr-ca|moderators)\.groups|health|help|(?:\w\w|secure)\.help|homes|(?:tw|tw\.v2)\.house|info|\w\w\.info|tw\.tool\.ks|au\.launch|legalredirect|(?:\w\w)\.lifestyle|(?:gh\.bouncer\.)?login|us\.l?rd|local|\w\w\.local|m|r\.m|\w\w\.m|mail|(?:\w\w\.overview|[\w-]+(?:\.c\.yom)?)\.mail|maktoob|malaysia|tw\.(?:user\.)?mall|maps|(?:\w\w|espanol|sgws2)\.maps|messenger|(?:\w\w|malaysia)\.messenger|\w\w\.meteo|mlogin|mobile|(?:\w\w|espanol|malaysia)\.mobile|tw\.(?:campaign\.)?money|tw\.movie|movies|(?:au|ca|nz|au\.rss|nz\.rss|tw|uk)\.movies|[\w.-]+\.msg|(?:\w\w|es-us)\.mujer|music|ca\.music|[\w-]+\.musica|my|us\.m
y|de\.nachrichten|ucs\.netsvs|news|(?:au|ca|fr|gr|hk|in|nz|ph|nz\.rss|sg|tw|uk)\.news|cookiex\.ngd|(?:\w\w|es-us)\.noticias|omg|(?:\w\w|es-us)\.omg|au\.oztips|rtb\.pclick|pilotx1|pipes|play|playerio|privacy|profile|tw\.promo|(?:au|hk|nz)\.promotions|publishing|(?:analytics|mailapps|media|ucs|us-locdrop|video)\.query|hk\.rd|(?:\w\w\.|fr-ca\.)?safely|screen|(?:\w\w|es-us)\.screen|scribe|search|(?:\w\w|w\w\.blog|\w\w\.dictionary|finance|\w\w\.finance|images|\w\w\.images|\w\w\.knowledge|\w\w\.lifestyle|\w\w\.local|malaysia|movies|\w\w\.movies|news|\w\w\.news|malaysia\.news|r|recipes|\w\w\.recipes|shine|shopping|\w\w\.shopping|sports|\w\w\.sports|tools|au\.tv|video|\w\w\.video|malaysia\.video)\.search|sec|rtb\.pclick\.secure|security|tw\.security|\w\w\.seguranca|\w\w\.seguridad|es-us\.seguridad|\w\w\.seguro|tw\.serviceplus|settings|shine|ca\.shine|shopping|ca\.shopping|\w+\.sitios|dashboard\.slingstone|(?:au\.|order\.)?smallbusiness|smarttv|rd\.software|de\.spiele|sports|(?:au|ca|fr|hk|nz|ph|profiles|au\.rss|nz\.rss|tw)\.sports|tw\.stock|au\.thehype|\w\w\.tiempo|es\.todo|toolbar|(?:\w\w|data|malaysia)\.toolbar|(?:au|nz)\.totaltravel|transparency|travel|tw\.travel||tv|(?:ar|au|de|fr|es|es-us|it|mx|nz|au\.rss|uk)\.tv|tw\.uwant|(?:mh|nz|qos|yep)\.video|weather|(?:au|ca|hk|in|nz|sg|ph|uk|us)\.weather|de\.wetter|www|au\.yel|video\.media\.yql|dmros\.ysm)\.)?yahoo\.com/"
|
||||||
to="https://$1yahoo.com/" />
|
to="https://$1yahoo.com/" />
|
||||||
|
|
||||||
<rule from="^http://([\w-]+)\.yahoofs\.com/"
|
<rule from="^http://([\w-]+)\.yahoofs\.com/"
|
||||||
|
@ -97,7 +97,7 @@ class SessionSinglePool(requests.Session):
|
|||||||
self.mount('http://', http_adapter)
|
self.mount('http://', http_adapter)
|
||||||
|
|
||||||
def close(self):
|
def close(self):
|
||||||
"""Call super, but clear adapters since there are managed globaly"""
|
"""Call super, but clear adapters since there are managed globally"""
|
||||||
self.adapters.clear()
|
self.adapters.clear()
|
||||||
super().close()
|
super().close()
|
||||||
|
|
||||||
|
@ -62,7 +62,7 @@ class Setting:
|
|||||||
return self.value
|
return self.value
|
||||||
|
|
||||||
def save(self, name, resp):
|
def save(self, name, resp):
|
||||||
"""Save cookie ``name`` in the HTTP reponse obect
|
"""Save cookie ``name`` in the HTTP response object
|
||||||
|
|
||||||
If needed, its overwritten in the inheritance."""
|
If needed, its overwritten in the inheritance."""
|
||||||
resp.set_cookie(name, self.value, max_age=COOKIE_MAX_AGE)
|
resp.set_cookie(name, self.value, max_age=COOKIE_MAX_AGE)
|
||||||
@ -125,7 +125,7 @@ class MultipleChoiceSetting(EnumStringSetting):
|
|||||||
self.value.append(choice)
|
self.value.append(choice)
|
||||||
|
|
||||||
def save(self, name, resp):
|
def save(self, name, resp):
|
||||||
"""Save cookie ``name`` in the HTTP reponse obect
|
"""Save cookie ``name`` in the HTTP response object
|
||||||
"""
|
"""
|
||||||
resp.set_cookie(name, ','.join(self.value), max_age=COOKIE_MAX_AGE)
|
resp.set_cookie(name, ','.join(self.value), max_age=COOKIE_MAX_AGE)
|
||||||
|
|
||||||
@ -160,7 +160,7 @@ class SetSetting(Setting):
|
|||||||
self.values = set(elements) # pylint: disable=attribute-defined-outside-init
|
self.values = set(elements) # pylint: disable=attribute-defined-outside-init
|
||||||
|
|
||||||
def save(self, name, resp):
|
def save(self, name, resp):
|
||||||
"""Save cookie ``name`` in the HTTP reponse obect
|
"""Save cookie ``name`` in the HTTP response object
|
||||||
"""
|
"""
|
||||||
resp.set_cookie(name, ','.join(self.values), max_age=COOKIE_MAX_AGE)
|
resp.set_cookie(name, ','.join(self.values), max_age=COOKIE_MAX_AGE)
|
||||||
|
|
||||||
@ -209,7 +209,7 @@ class MapSetting(Setting):
|
|||||||
self.key = data # pylint: disable=attribute-defined-outside-init
|
self.key = data # pylint: disable=attribute-defined-outside-init
|
||||||
|
|
||||||
def save(self, name, resp):
|
def save(self, name, resp):
|
||||||
"""Save cookie ``name`` in the HTTP reponse obect
|
"""Save cookie ``name`` in the HTTP response object
|
||||||
"""
|
"""
|
||||||
if hasattr(self, 'key'):
|
if hasattr(self, 'key'):
|
||||||
resp.set_cookie(name, self.key, max_age=COOKIE_MAX_AGE)
|
resp.set_cookie(name, self.key, max_age=COOKIE_MAX_AGE)
|
||||||
@ -253,7 +253,7 @@ class SwitchableSetting(Setting):
|
|||||||
self.enabled.add(choice['id'])
|
self.enabled.add(choice['id'])
|
||||||
|
|
||||||
def save(self, resp): # pylint: disable=arguments-differ
|
def save(self, resp): # pylint: disable=arguments-differ
|
||||||
"""Save cookie in the HTTP reponse obect
|
"""Save cookie in the HTTP response object
|
||||||
"""
|
"""
|
||||||
resp.set_cookie('disabled_{0}'.format(self.value), ','.join(self.disabled), max_age=COOKIE_MAX_AGE)
|
resp.set_cookie('disabled_{0}'.format(self.value), ','.join(self.disabled), max_age=COOKIE_MAX_AGE)
|
||||||
resp.set_cookie('enabled_{0}'.format(self.value), ','.join(self.enabled), max_age=COOKIE_MAX_AGE)
|
resp.set_cookie('enabled_{0}'.format(self.value), ','.join(self.enabled), max_age=COOKIE_MAX_AGE)
|
||||||
@ -517,7 +517,7 @@ class Preferences:
|
|||||||
return ret_val
|
return ret_val
|
||||||
|
|
||||||
def save(self, resp):
|
def save(self, resp):
|
||||||
"""Save cookie in the HTTP reponse obect
|
"""Save cookie in the HTTP response object
|
||||||
"""
|
"""
|
||||||
for user_setting_name, user_setting in self.key_value_settings.items():
|
for user_setting_name, user_setting in self.key_value_settings.items():
|
||||||
if user_setting.locked:
|
if user_setting.locked:
|
||||||
|
@ -197,10 +197,10 @@ class BangParser(QueryPartParser):
|
|||||||
self.raw_text_query.enginerefs.append(EngineRef(value, 'none'))
|
self.raw_text_query.enginerefs.append(EngineRef(value, 'none'))
|
||||||
return True
|
return True
|
||||||
|
|
||||||
# check if prefix is equal with categorie name
|
# check if prefix is equal with category name
|
||||||
if value in categories:
|
if value in categories:
|
||||||
# using all engines for that search, which
|
# using all engines for that search, which
|
||||||
# are declared under that categorie name
|
# are declared under that category name
|
||||||
self.raw_text_query.enginerefs.extend(EngineRef(engine.name, value)
|
self.raw_text_query.enginerefs.extend(EngineRef(engine.name, value)
|
||||||
for engine in categories[value]
|
for engine in categories[value]
|
||||||
if (engine.name, value) not in self.raw_text_query.disabled_engines)
|
if (engine.name, value) not in self.raw_text_query.disabled_engines)
|
||||||
@ -216,7 +216,7 @@ class BangParser(QueryPartParser):
|
|||||||
self._add_autocomplete(first_char + suggestion)
|
self._add_autocomplete(first_char + suggestion)
|
||||||
return
|
return
|
||||||
|
|
||||||
# check if query starts with categorie name
|
# check if query starts with category name
|
||||||
for category in categories:
|
for category in categories:
|
||||||
if category.startswith(value):
|
if category.startswith(value):
|
||||||
self._add_autocomplete(first_char + category)
|
self._add_autocomplete(first_char + category)
|
||||||
@ -309,7 +309,7 @@ class RawTextQuery:
|
|||||||
|
|
||||||
def getFullQuery(self):
|
def getFullQuery(self):
|
||||||
"""
|
"""
|
||||||
get full querry including whitespaces
|
get full query including whitespaces
|
||||||
"""
|
"""
|
||||||
return '{0} {1}'.format(' '.join(self.query_parts), self.getQuery()).strip()
|
return '{0} {1}'.format(' '.join(self.query_parts), self.getQuery()).strip()
|
||||||
|
|
||||||
|
@ -143,9 +143,9 @@ def result_score(result, language):
|
|||||||
if language in domain_parts:
|
if language in domain_parts:
|
||||||
weight *= 1.1
|
weight *= 1.1
|
||||||
|
|
||||||
occurences = len(result['positions'])
|
occurrences = len(result['positions'])
|
||||||
|
|
||||||
return sum((occurences * weight) / position for position in result['positions'])
|
return sum((occurrences * weight) / position for position in result['positions'])
|
||||||
|
|
||||||
|
|
||||||
class ResultContainer:
|
class ResultContainer:
|
||||||
@ -252,7 +252,7 @@ class ResultContainer:
|
|||||||
|
|
||||||
result['engines'] = set([result['engine']])
|
result['engines'] = set([result['engine']])
|
||||||
|
|
||||||
# strip multiple spaces and cariage returns from content
|
# strip multiple spaces and carriage returns from content
|
||||||
if result.get('content'):
|
if result.get('content'):
|
||||||
result['content'] = WHITESPACE_REGEX.sub(' ', result['content'])
|
result['content'] = WHITESPACE_REGEX.sub(' ', result['content'])
|
||||||
|
|
||||||
@ -278,7 +278,7 @@ class ResultContainer:
|
|||||||
return merged_result
|
return merged_result
|
||||||
else:
|
else:
|
||||||
# it's an image
|
# it's an image
|
||||||
# it's a duplicate if the parsed_url, template and img_src are differents
|
# it's a duplicate if the parsed_url, template and img_src are different
|
||||||
if result.get('img_src', '') == merged_result.get('img_src', ''):
|
if result.get('img_src', '') == merged_result.get('img_src', ''):
|
||||||
return merged_result
|
return merged_result
|
||||||
return None
|
return None
|
||||||
|
@ -60,7 +60,7 @@ def run(engine_name_list, verbose):
|
|||||||
stderr.write(f'{BOLD_SEQ}Engine {name:30}{RESET_SEQ}Checking\n')
|
stderr.write(f'{BOLD_SEQ}Engine {name:30}{RESET_SEQ}Checking\n')
|
||||||
checker = searx.search.checker.Checker(processor)
|
checker = searx.search.checker.Checker(processor)
|
||||||
checker.run()
|
checker.run()
|
||||||
if checker.test_results.succesfull:
|
if checker.test_results.successful:
|
||||||
stdout.write(f'{BOLD_SEQ}Engine {name:30}{RESET_SEQ}{GREEN}OK{RESET_SEQ}\n')
|
stdout.write(f'{BOLD_SEQ}Engine {name:30}{RESET_SEQ}{GREEN}OK{RESET_SEQ}\n')
|
||||||
if verbose:
|
if verbose:
|
||||||
stdout.write(f' {"found languages":15}: {" ".join(sorted(list(checker.test_results.languages)))}\n')
|
stdout.write(f' {"found languages":15}: {" ".join(sorted(list(checker.test_results.languages)))}\n')
|
||||||
|
@ -59,7 +59,7 @@ def run():
|
|||||||
logger.debug('Checking %s engine', name)
|
logger.debug('Checking %s engine', name)
|
||||||
checker = Checker(processor)
|
checker = Checker(processor)
|
||||||
checker.run()
|
checker.run()
|
||||||
if checker.test_results.succesfull:
|
if checker.test_results.successful:
|
||||||
result['engines'][name] = {'success': True}
|
result['engines'][name] = {'success': True}
|
||||||
else:
|
else:
|
||||||
result['engines'][name] = {'success': False, 'errors': checker.test_results.errors}
|
result['engines'][name] = {'success': False, 'errors': checker.test_results.errors}
|
||||||
|
@ -146,7 +146,7 @@ class TestResults:
|
|||||||
self.languages.add(language)
|
self.languages.add(language)
|
||||||
|
|
||||||
@property
|
@property
|
||||||
def succesfull(self):
|
def successful(self):
|
||||||
return len(self.errors) == 0
|
return len(self.errors) == 0
|
||||||
|
|
||||||
def __iter__(self):
|
def __iter__(self):
|
||||||
@ -291,7 +291,7 @@ class ResultContainerTests:
|
|||||||
self._record_error('No result')
|
self._record_error('No result')
|
||||||
|
|
||||||
def one_title_contains(self, title: str):
|
def one_title_contains(self, title: str):
|
||||||
"""Check one of the title contains `title` (case insensitive comparaison)"""
|
"""Check one of the title contains `title` (case insensitive comparison)"""
|
||||||
title = title.lower()
|
title = title.lower()
|
||||||
for result in self.result_container.get_ordered_results():
|
for result in self.result_container.get_ordered_results():
|
||||||
if title in result['title'].lower():
|
if title in result['title'].lower():
|
||||||
|
@ -56,7 +56,7 @@ class OnlineProcessor(EngineProcessor):
|
|||||||
|
|
||||||
def _send_http_request(self, params):
|
def _send_http_request(self, params):
|
||||||
# create dictionary which contain all
|
# create dictionary which contain all
|
||||||
# informations about the request
|
# information about the request
|
||||||
request_args = dict(
|
request_args = dict(
|
||||||
headers=params['headers'],
|
headers=params['headers'],
|
||||||
cookies=params['cookies'],
|
cookies=params['cookies'],
|
||||||
|
@ -19,7 +19,7 @@ search:
|
|||||||
default_lang : "" # Default search language - leave blank to detect from browser information or use codes from 'languages.py'
|
default_lang : "" # Default search language - leave blank to detect from browser information or use codes from 'languages.py'
|
||||||
ban_time_on_fail : 5 # ban time in seconds after engine errors
|
ban_time_on_fail : 5 # ban time in seconds after engine errors
|
||||||
max_ban_time_on_fail : 120 # max ban time in seconds after engine errors
|
max_ban_time_on_fail : 120 # max ban time in seconds after engine errors
|
||||||
prefer_configured_language: False # increase weight of results in confiugred language in ranking
|
prefer_configured_language: False # increase weight of results in configured language in ranking
|
||||||
|
|
||||||
server:
|
server:
|
||||||
port : 8888
|
port : 8888
|
||||||
@ -73,7 +73,7 @@ ui:
|
|||||||
outgoing: # communication with search engines
|
outgoing: # communication with search engines
|
||||||
request_timeout : 2.0 # default timeout in seconds, can be override by engine
|
request_timeout : 2.0 # default timeout in seconds, can be override by engine
|
||||||
# max_request_timeout: 10.0 # the maximum timeout in seconds
|
# max_request_timeout: 10.0 # the maximum timeout in seconds
|
||||||
useragent_suffix : "" # suffix of searx_useragent, could contain informations like an email address to the administrator
|
useragent_suffix : "" # suffix of searx_useragent, could contain information like an email address to the administrator
|
||||||
pool_connections : 100 # Number of different hosts
|
pool_connections : 100 # Number of different hosts
|
||||||
pool_maxsize : 10 # Number of simultaneous requests by host
|
pool_maxsize : 10 # Number of simultaneous requests by host
|
||||||
# uncomment below section if you want to use a proxy
|
# uncomment below section if you want to use a proxy
|
||||||
|
@ -37,7 +37,7 @@ def get_user_settings_path():
|
|||||||
# find location of settings.yml
|
# find location of settings.yml
|
||||||
if 'SEARX_SETTINGS_PATH' in environ:
|
if 'SEARX_SETTINGS_PATH' in environ:
|
||||||
# if possible set path to settings using the
|
# if possible set path to settings using the
|
||||||
# enviroment variable SEARX_SETTINGS_PATH
|
# environment variable SEARX_SETTINGS_PATH
|
||||||
return check_settings_yml(environ['SEARX_SETTINGS_PATH'])
|
return check_settings_yml(environ['SEARX_SETTINGS_PATH'])
|
||||||
|
|
||||||
# if not, get it from /etc/searx, or last resort the codebase
|
# if not, get it from /etc/searx, or last resort the codebase
|
||||||
@ -132,7 +132,7 @@ def load_settings(load_user_setttings=True):
|
|||||||
default_settings = load_yaml(default_settings_path)
|
default_settings = load_yaml(default_settings_path)
|
||||||
update_settings(default_settings, user_settings)
|
update_settings(default_settings, user_settings)
|
||||||
return (default_settings,
|
return (default_settings,
|
||||||
'merge the default settings ( {} ) and the user setttings ( {} )'
|
'merge the default settings ( {} ) and the user settings ( {} )'
|
||||||
.format(default_settings_path, user_settings_path))
|
.format(default_settings_path, user_settings_path))
|
||||||
|
|
||||||
# the user settings, fully replace the default configuration
|
# the user settings, fully replace the default configuration
|
||||||
|
@ -96,7 +96,7 @@
|
|||||||
|
|
||||||
{% if 'method' not in locked_preferences %}
|
{% if 'method' not in locked_preferences %}
|
||||||
{% set method_label = _('Method') %}
|
{% set method_label = _('Method') %}
|
||||||
{% set method_info = _('Change how forms are submited, <a href="http://en.wikipedia.org/wiki/Hypertext_Transfer_Protocol#Request_methods" rel="external">learn more about request methods</a>') %}
|
{% set method_info = _('Change how forms are submitted, <a href="http://en.wikipedia.org/wiki/Hypertext_Transfer_Protocol#Request_methods" rel="external">learn more about request methods</a>') %}
|
||||||
{{ preferences_item_header(method_info, method_label, rtl, 'method') }}
|
{{ preferences_item_header(method_info, method_label, rtl, 'method') }}
|
||||||
<select class="form-control {{ custom_select_class(rtl) }}" name="method" id="method">
|
<select class="form-control {{ custom_select_class(rtl) }}" name="method" id="method">
|
||||||
<option value="POST" {% if method == 'POST' %}selected="selected"{% endif %}>POST</option>
|
<option value="POST" {% if method == 'POST' %}selected="selected"{% endif %}>POST</option>
|
||||||
|
@ -114,6 +114,6 @@ if __name__ == '__main__':
|
|||||||
run_robot_tests([getattr(robot, x) for x in dir(robot) if x.startswith('test_')])
|
run_robot_tests([getattr(robot, x) for x in dir(robot) if x.startswith('test_')])
|
||||||
except Exception: # pylint: disable=broad-except
|
except Exception: # pylint: disable=broad-except
|
||||||
errors = True
|
errors = True
|
||||||
print('Error occured: {0}'.format(traceback.format_exc()))
|
print('Error occurred: {0}'.format(traceback.format_exc()))
|
||||||
test_layer.tearDown()
|
test_layer.tearDown()
|
||||||
sys.exit(1 if errors else 0)
|
sys.exit(1 if errors else 0)
|
||||||
|
@ -53,7 +53,7 @@ def parse_lang(preferences: Preferences, form: Dict[str, str], raw_text_query: R
|
|||||||
return preferences.get_value('language')
|
return preferences.get_value('language')
|
||||||
# get language
|
# get language
|
||||||
# set specific language if set on request, query or preferences
|
# set specific language if set on request, query or preferences
|
||||||
# TODO support search with multible languages
|
# TODO support search with multiple languages
|
||||||
if len(raw_text_query.languages):
|
if len(raw_text_query.languages):
|
||||||
query_lang = raw_text_query.languages[-1]
|
query_lang = raw_text_query.languages[-1]
|
||||||
elif 'language' in form:
|
elif 'language' in form:
|
||||||
@ -216,7 +216,7 @@ def get_search_query_from_webapp(preferences: Preferences, form: Dict[str, str])
|
|||||||
disabled_engines = preferences.engines.get_disabled()
|
disabled_engines = preferences.engines.get_disabled()
|
||||||
|
|
||||||
# parse query, if tags are set, which change
|
# parse query, if tags are set, which change
|
||||||
# the serch engine or search-language
|
# the search engine or search-language
|
||||||
raw_text_query = RawTextQuery(form['q'], disabled_engines)
|
raw_text_query = RawTextQuery(form['q'], disabled_engines)
|
||||||
|
|
||||||
# set query
|
# set query
|
||||||
@ -231,7 +231,7 @@ def get_search_query_from_webapp(preferences: Preferences, form: Dict[str, str])
|
|||||||
|
|
||||||
if not is_locked('categories') and raw_text_query.enginerefs and raw_text_query.specific:
|
if not is_locked('categories') and raw_text_query.enginerefs and raw_text_query.specific:
|
||||||
# if engines are calculated from query,
|
# if engines are calculated from query,
|
||||||
# set categories by using that informations
|
# set categories by using that information
|
||||||
query_engineref_list = raw_text_query.enginerefs
|
query_engineref_list = raw_text_query.enginerefs
|
||||||
else:
|
else:
|
||||||
# otherwise, using defined categories to
|
# otherwise, using defined categories to
|
||||||
|
@ -225,7 +225,7 @@ def code_highlighter(codelines, language=None):
|
|||||||
language = 'text'
|
language = 'text'
|
||||||
|
|
||||||
try:
|
try:
|
||||||
# find lexer by programing language
|
# find lexer by programming language
|
||||||
lexer = get_lexer_by_name(language, stripall=True)
|
lexer = get_lexer_by_name(language, stripall=True)
|
||||||
except:
|
except:
|
||||||
# if lexer is not found, using default one
|
# if lexer is not found, using default one
|
||||||
|
@ -35,7 +35,7 @@ class UnicodeWriter:
|
|||||||
# Fetch UTF-8 output from the queue ...
|
# Fetch UTF-8 output from the queue ...
|
||||||
data = self.queue.getvalue()
|
data = self.queue.getvalue()
|
||||||
data = data.strip('\x00')
|
data = data.strip('\x00')
|
||||||
# ... and reencode it into the target encoding
|
# ... and re-encode it into the target encoding
|
||||||
data = self.encoder.encode(data)
|
data = self.encoder.encode(data)
|
||||||
# write to the target stream
|
# write to the target stream
|
||||||
self.stream.write(data.decode())
|
self.stream.write(data.decode())
|
||||||
|
@ -13,7 +13,7 @@ from searx.engines.wikidata import send_wikidata_query
|
|||||||
|
|
||||||
|
|
||||||
# ORDER BY (with all the query fields) is important to keep a deterministic result order
|
# ORDER BY (with all the query fields) is important to keep a deterministic result order
|
||||||
# so multiple invokation of this script doesn't change currencies.json
|
# so multiple invocation of this script doesn't change currencies.json
|
||||||
SARQL_REQUEST = """
|
SARQL_REQUEST = """
|
||||||
SELECT DISTINCT ?iso4217 ?unit ?unicode ?label ?alias WHERE {
|
SELECT DISTINCT ?iso4217 ?unit ?unicode ?label ?alias WHERE {
|
||||||
?item wdt:P498 ?iso4217; rdfs:label ?label.
|
?item wdt:P498 ?iso4217; rdfs:label ?label.
|
||||||
@ -29,7 +29,7 @@ ORDER BY ?iso4217 ?unit ?unicode ?label ?alias
|
|||||||
"""
|
"""
|
||||||
|
|
||||||
# ORDER BY (with all the query fields) is important to keep a deterministic result order
|
# ORDER BY (with all the query fields) is important to keep a deterministic result order
|
||||||
# so multiple invokation of this script doesn't change currencies.json
|
# so multiple invocation of this script doesn't change currencies.json
|
||||||
SPARQL_WIKIPEDIA_NAMES_REQUEST = """
|
SPARQL_WIKIPEDIA_NAMES_REQUEST = """
|
||||||
SELECT DISTINCT ?iso4217 ?article_name WHERE {
|
SELECT DISTINCT ?iso4217 ?article_name WHERE {
|
||||||
?item wdt:P498 ?iso4217 .
|
?item wdt:P498 ?iso4217 .
|
||||||
|
@ -18,7 +18,7 @@ engines_languages_file = Path(searx_dir) / 'data' / 'engines_languages.json'
|
|||||||
languages_file = Path(searx_dir) / 'languages.py'
|
languages_file = Path(searx_dir) / 'languages.py'
|
||||||
|
|
||||||
|
|
||||||
# Fetchs supported languages for each engine and writes json file with those.
|
# Fetches supported languages for each engine and writes json file with those.
|
||||||
def fetch_supported_languages():
|
def fetch_supported_languages():
|
||||||
|
|
||||||
engines_languages = dict()
|
engines_languages = dict()
|
||||||
|
20
utils/lib.sh
20
utils/lib.sh
@ -451,17 +451,17 @@ install_template() {
|
|||||||
fi
|
fi
|
||||||
|
|
||||||
if [[ -f "${dst}" ]] && cmp --silent "${template_file}" "${dst}" ; then
|
if [[ -f "${dst}" ]] && cmp --silent "${template_file}" "${dst}" ; then
|
||||||
info_msg "file ${dst} allready installed"
|
info_msg "file ${dst} already installed"
|
||||||
return 0
|
return 0
|
||||||
fi
|
fi
|
||||||
|
|
||||||
info_msg "diffrent file ${dst} allready exists on this host"
|
info_msg "different file ${dst} already exists on this host"
|
||||||
|
|
||||||
while true; do
|
while true; do
|
||||||
choose_one _reply "choose next step with file $dst" \
|
choose_one _reply "choose next step with file $dst" \
|
||||||
"replace file" \
|
"replace file" \
|
||||||
"leave file unchanged" \
|
"leave file unchanged" \
|
||||||
"interactiv shell" \
|
"interactive shell" \
|
||||||
"diff files"
|
"diff files"
|
||||||
|
|
||||||
case $_reply in
|
case $_reply in
|
||||||
@ -474,7 +474,7 @@ install_template() {
|
|||||||
"leave file unchanged")
|
"leave file unchanged")
|
||||||
break
|
break
|
||||||
;;
|
;;
|
||||||
"interactiv shell")
|
"interactive shell")
|
||||||
echo -e "// edit ${_Red}${dst}${_creset} to your needs"
|
echo -e "// edit ${_Red}${dst}${_creset} to your needs"
|
||||||
echo -e "// exit with [${_BCyan}CTRL-D${_creset}]"
|
echo -e "// exit with [${_BCyan}CTRL-D${_creset}]"
|
||||||
sudo -H -u "${owner}" -i
|
sudo -H -u "${owner}" -i
|
||||||
@ -1018,8 +1018,8 @@ nginx_install_app() {
|
|||||||
|
|
||||||
nginx_include_apps_enabled() {
|
nginx_include_apps_enabled() {
|
||||||
|
|
||||||
# Add the *NGINX_APPS_ENABLED* infrastruture to a nginx server block. Such
|
# Add the *NGINX_APPS_ENABLED* infrastructure to a nginx server block. Such
|
||||||
# infrastruture is already known from fedora and centos, including apps (location
|
# infrastructure is already known from fedora and centos, including apps (location
|
||||||
# directives) from the /etc/nginx/default.d folder into the *default* nginx
|
# directives) from the /etc/nginx/default.d folder into the *default* nginx
|
||||||
# server.
|
# server.
|
||||||
|
|
||||||
@ -1521,7 +1521,7 @@ _apt_pkg_info_is_updated=0
|
|||||||
|
|
||||||
pkg_install() {
|
pkg_install() {
|
||||||
|
|
||||||
# usage: TITEL='install foobar' pkg_install foopkg barpkg
|
# usage: TITLE='install foobar' pkg_install foopkg barpkg
|
||||||
|
|
||||||
rst_title "${TITLE:-installation of packages}" section
|
rst_title "${TITLE:-installation of packages}" section
|
||||||
echo -e "\npackage(s)::\n"
|
echo -e "\npackage(s)::\n"
|
||||||
@ -1557,7 +1557,7 @@ pkg_install() {
|
|||||||
|
|
||||||
pkg_remove() {
|
pkg_remove() {
|
||||||
|
|
||||||
# usage: TITEL='remove foobar' pkg_remove foopkg barpkg
|
# usage: TITLE='remove foobar' pkg_remove foopkg barpkg
|
||||||
|
|
||||||
rst_title "${TITLE:-remove packages}" section
|
rst_title "${TITLE:-remove packages}" section
|
||||||
echo -e "\npackage(s)::\n"
|
echo -e "\npackage(s)::\n"
|
||||||
@ -1623,7 +1623,7 @@ git_clone() {
|
|||||||
# git_clone <url> <path> [<branch> [<user>]]
|
# git_clone <url> <path> [<branch> [<user>]]
|
||||||
#
|
#
|
||||||
# First form uses $CACHE/<name> as destination folder, second form clones
|
# First form uses $CACHE/<name> as destination folder, second form clones
|
||||||
# into <path>. If repository is allready cloned, pull from <branch> and
|
# into <path>. If repository is already cloned, pull from <branch> and
|
||||||
# update working tree (if needed, the caller has to stash local changes).
|
# update working tree (if needed, the caller has to stash local changes).
|
||||||
#
|
#
|
||||||
# git clone https://github.com/searx/searx searx-src origin/master searxlogin
|
# git clone https://github.com/searx/searx searx-src origin/master searxlogin
|
||||||
@@ -1696,7 +1696,7 @@ lxc_init_container_env() {
 # usage: lxc_init_container_env <name>
 
 # Create a /.lxcenv file in the root folder. Call this once after the
-# container is inital started and before installing any boilerplate stuff.
+# container is initial started and before installing any boilerplate stuff.
 
 info_msg "create /.lxcenv in container $1"
 cat <<EOF | lxc exec "${1}" -- bash | prefix_stdout "[${_BBlue}${1}${_creset}] "
@@ -107,7 +107,7 @@ show
 :suite: show services of all (or <name>) containers from the LXC suite
 :images: show information of local images
 cmd
-use single qoutes to evaluate in container's bash, e.g.: 'echo \$(hostname)'
+use single quotes to evaluate in container's bash, e.g.: 'echo \$(hostname)'
 -- run command '...' in all containers of the LXC suite
 :<name>: run command '...' in container <name>
 install
@@ -178,7 +178,7 @@ main() {
 lxc_delete_container "$2"
 fi
 ;;
-*) usage "uknown or missing container <name> $2"; exit 42;;
+*) usage "unknown or missing container <name> $2"; exit 42;;
 esac
 ;;
 start|stop)
@@ -190,7 +190,7 @@ main() {
 info_msg "lxc $1 $2"
 lxc "$1" "$2" | prefix_stdout "[${_BBlue}${i}${_creset}] "
 ;;
-*) usage "uknown or missing container <name> $2"; exit 42;;
+*) usage "unknown or missing container <name> $2"; exit 42;;
 esac
 ;;
 show)
@@ -402,7 +402,7 @@ EOF
 }
 
 enable_debug() {
-warn_msg "Do not enable debug in production enviroments!!"
+warn_msg "Do not enable debug in production environments!!"
 info_msg "Enabling debug option needs to reinstall systemd service!"
 set_service_env_debug true
 }
@@ -436,7 +436,7 @@ install_settings() {
 choose_one action "What should happen to the settings file? " \
 "keep configuration unchanged" \
 "use origin settings" \
-"start interactiv shell"
+"start interactive shell"
 case $action in
 "keep configuration unchanged")
 info_msg "leave settings file unchanged"
@@ -446,7 +446,7 @@ install_settings() {
 info_msg "install origin settings"
 cp "${SEARX_SETTINGS_TEMPLATE}" "${SEARX_SETTINGS_PATH}"
 ;;
-"start interactiv shell")
+"start interactive shell")
 backup_file "${SEARX_SETTINGS_PATH}"
 echo -e "// exit with [${_BCyan}CTRL-D${_creset}]"
 sudo -H -i
@@ -600,7 +600,7 @@ EOF
 }
 
 enable_debug() {
-warn_msg "Do not enable debug in production enviroments!!"
+warn_msg "Do not enable debug in production environments!!"
 info_msg "try to enable debug mode ..."
 tee_stderr 0.1 <<EOF | sudo -H -i 2>&1 | prefix_stdout "$_service_prefix"
 cd ${SEARX_SRC}
@@ -833,8 +833,8 @@ rst-doc() {
 
 eval "echo \"$(< "${REPO_ROOT}/docs/build-templates/searx.rst")\""
 
-# I use ubuntu-20.04 here to demonstrate that versions are also suported,
-# normaly debian-* and ubuntu-* are most the same.
+# I use ubuntu-20.04 here to demonstrate that versions are also supported,
+# normally debian-* and ubuntu-* are most the same.
 
 for DIST_NAME in ubuntu-20.04 arch fedora; do
 (
@@ -27,7 +27,7 @@ disable-logging = true
 # The right granted on the created socket
 chmod-socket = 666
 
-# Plugin to use and interpretor config
+# Plugin to use and interpreter config
 single-interpreter = true
 
 # enable master process
@@ -27,7 +27,7 @@ disable-logging = true
 # The right granted on the created socket
 chmod-socket = 666
 
-# Plugin to use and interpretor config
+# Plugin to use and interpreter config
 single-interpreter = true
 
 # enable master process
@@ -26,7 +26,7 @@ disable-logging = true
 # The right granted on the created socket
 chmod-socket = 666
 
-# Plugin to use and interpretor config
+# Plugin to use and interpreter config
 single-interpreter = true
 
 # enable master process
@@ -26,7 +26,7 @@ disable-logging = true
 # The right granted on the created socket
 chmod-socket = 666
 
-# Plugin to use and interpretor config
+# Plugin to use and interpreter config
 single-interpreter = true
 
 # enable master process