# New architecture proposal

Alexandre Flament 2020-07-09 14:45:08 +02:00
parent 85fbc927a7
commit 9fb301b7e8
![New architecture diagram](https://www.planttext.com/api/plantuml/svg/ZLLBJzj04BxdLupC1Qg01Ea1vL1HIaiv58KcBKAeXuqzjbTUxrhxm4GL_xspZpYFGbK4qdg-USytmuMnJ5lNYj7eyLAhtg3-1Sp0Chv7euxb3QiGiZdg9zGP_1u1F7v7UXk4cLRFgs1W0konBrnOhMIGbl7jnT-Kjb6rzIylcYFJpr2IWWMKHRJmNmHAPH44bXPKEGW1M6uT4s854ZgjSZG63JZ3PGLpgaIW9RD3qx5UUXTGAjrWSNGIZ6yas1gXTDPf7DER6yYPX2M2nawm70jOhe11fGGZZxv6hkGOc2m0soN1gRWW2HL5pxSA919CuS3rfk9ZgDmJpvMMG452amccb2lDsfPPdZCHRAdkK2izwJUr7OuBrSlZ3wchK_YwMCpcXCVl0KxjZJZ1wNl0GwVVH4DN4_goDUbxtpDjugW7QH-afr4Aa5qhLGc4pxBY4hCTsKzUe7eLkSNUx08FoYqS3G07uSlVKS_MKI6xSddZ_wxLbiasS8KbbzpItFOKxdd3EomuUqLUCCl-dS9aayAbPzkkZmVY0pxl2Uzw5eYzyUg74DY-7a5gwXH-NCs1kBFuD7j3v-m_T2OxEdwcOOkiwlmIfR5wph6yJYQUdGncArkhOQC6klOrIgASNz8DYJrfSzMsJXA7GvjxRkkmMDdUGcTfCpT9mQ5WbgBUutAkyWPjsCpjV6gw0PoMdlOta9vYIObv2DGf9U8oDeYTJyXoGwjal1FQOox9jsG2FivkWAu4xLBCo-pdPTMKJDvHSUICWSyxh-ySwbLmNw4DXXgD4zO4LI_RZHXv74vM6zGDQz7tfcEI_6Y4IYj7b0CbDtYgqofg3cx4ljZzMa_YKRYXm4eyKIHkZ4CuE3q_dT2Hgob40O6wIxGzqh2YSpEban8EKcYlMRNTQdteMDAP0YzzetFDJ8rc739P4o3fDbm86ZY5yk413YRdxu_8OIeXQV6s4zYYz8pXT1cXu6M9cWJWhwZpIsHhqeqiEnaD9uz-BkY3_lly0G00)
---
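The diagram's worker note gives each worker its own HTTPS connection pool, per-engine statistics, and an internal API (send a SearchQuery, get results, get statistics). A minimal sketch of that worker API in Python, with hypothetical names and mock engines standing in for real HTTP calls — the actual process boundary is omitted here:

```python
import time
from dataclasses import dataclass


@dataclass
class SearchQuery:
    query: str
    engines: list


class Worker:
    """Groups engines from the same company so they share one connection pool."""

    def __init__(self, name, engines):
        self.name = name
        self.engines = engines          # engine name -> callable
        self.pool = object()            # stand-in for a shared HTTPS connection pool
        self.stats = {e: {"requests": 0, "total_time": 0.0} for e in engines}

    def search(self, sq: SearchQuery):
        """Internal API: run the query on each requested engine, record stats."""
        results = []
        for engine in sq.engines:
            if engine not in self.engines:
                continue
            start = time.monotonic()
            results.extend(self.engines[engine](sq.query))
            st = self.stats[engine]
            st["requests"] += 1
            st["total_time"] += time.monotonic() - start
        return results

    def statistics(self):
        """Internal API: expose per-engine statistics to the front end."""
        return self.stats


# usage: one worker holding two mock "same company" engines
def google(q):
    return [f"google:{q}"]


def youtube(q):
    return [f"youtube:{q}"]


worker = Worker("worker1", {"google": google, "youtube": youtube})
hits = worker.search(SearchQuery("searx", ["google", "youtube"]))
```

Because the pool and statistics live inside one worker process, no data has to be shared across workers, which is the point made about #1813.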
<!--
https://www.planttext.com/
@startuml
[Browser] as browser
package "Server" {
[Reverse Proxy] as server
[Filtron] as filtron
[Morty] as morty
[Searx front end] as front
note left of front
actually n processes using Starlette (spawned, not forked).
In the future, this could be replaced by a Go version embedding Filtron and Morty.
No need for uWSGI: uvicorn is started programmatically.
end note
browser -down-> server: HTTPS
server -down-> filtron: HTTP
filtron -down-> morty: HTTP
filtron -down-> front: HTTP
}
package "Workers" {
package "worker1" {
"Google Engine"
"Google Video Engine"
"Google News Engine"
"YouTube engine"
}
package "worker2" {
"DuckDuckGo engine"
"DuckDuckDefinition engine"
"Wikipedia engine"
"Wikidata engine"
}
package "worker3" {
"Bing engine"
"Yandex engine"
"Qwant engine"
}
front -down-> worker1: UDS or TCP
front -down-> worker2: UDS or TCP
front -down-> worker3: UDS or TCP
}
note top of Workers
Each worker is a Python process started when searx starts.
Communication with the "Searx front end" uses UDS or TCP (later WebSocket).
Each worker has its own HTTPS connection pool and keeps statistics.
An internal API allows the "Searx front end" to:
* send a SearchQuery and get the results
* get statistics.
Engines from the same company are grouped in the same worker so they share one HTTP connection pool.
This solves issue #1813 (no data needs to be shared between workers).
sympy can be a worker: if it crashes, the process can be killed (solves #384).
A worker can be implemented in a language other than Python.
end note
@enduml
-->
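The front end talks to each worker over UDS or TCP. One way to sketch that link with the standard library's `multiprocessing.connection` — a thread stands in for the worker process here so the example is self-contained, and the message shapes (`("search", …)`, `("stats", …)`) are illustrative assumptions, not the proposal's actual protocol:

```python
import os
import tempfile
import threading
from multiprocessing.connection import Client, Listener

# Unix domain socket path for this worker (hypothetical layout)
address = os.path.join(tempfile.mkdtemp(), "worker1.sock")


def worker_loop(listener):
    """Worker side: answer search and statistics requests until told to stop."""
    with listener.accept() as conn:
        while True:
            msg = conn.recv()
            if msg == ("stop",):
                break
            kind, payload = msg
            if kind == "search":
                conn.send([f"result for {payload}"])   # mock engine results
            elif kind == "stats":
                conn.send({"requests": 1})             # mock statistics


listener = Listener(address, family="AF_UNIX")
t = threading.Thread(target=worker_loop, args=(listener,), daemon=True)
t.start()

# Front-end side: send a query, then ask for statistics
with Client(address, family="AF_UNIX") as conn:
    conn.send(("search", "searx"))
    results = conn.recv()
    conn.send(("stats", None))
    stats = conn.recv()
    conn.send(("stop",))
t.join()
listener.close()
```

In the real design each worker would be a separate spawned process listening on its own socket, so a crashed or hung worker can be killed and restarted without touching the front end.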