mirror of https://github.com/zedeus/nitter
Merge branch 'master' into thread-rss
commit ba5182c732
@@ -0,0 +1,6 @@
+*.png
+*.md
+LICENSE
+docker-compose.yml
+Dockerfile
+tests/
@@ -3,6 +3,7 @@ nitter
 *.db
 /tests/__pycache__
 /tests/geckodriver.log
 /tmp
 /tests/downloaded_files/*
 /tools/gencss
 /public/css/style.css
 nitter.conf
Dockerfile (17 lines changed)
@@ -1,19 +1,20 @@
-FROM nimlang/nim:alpine as nim
-MAINTAINER setenforce@protonmail.com
+FROM nimlang/nim:1.6.2-alpine-regular as nim
+LABEL maintainer="setenforce@protonmail.com"
 EXPOSE 8080

-RUN apk --no-cache add libsass-dev libffi-dev openssl-dev redis openssh-client
+RUN apk --no-cache add libsass-dev

 COPY . /src/nitter
 WORKDIR /src/nitter

-RUN nimble build -y -d:release --passC:"-flto" --passL:"-flto" \
+RUN nimble build -y -d:release -d:danger --passC:"-flto" --passL:"-flto" \
   && strip -s nitter \
   && nimble scss

-FROM redis:6.0.4-alpine
+FROM alpine:latest
 WORKDIR /src/
-RUN apk --no-cache add pcre-dev sqlite-dev
-COPY --from=nim /src/nitter/nitter /src/nitter/start.sh /src/nitter/nitter.conf ./
+RUN apk --no-cache add pcre
+COPY --from=nim /src/nitter/nitter ./
+COPY --from=nim /src/nitter/nitter.example.conf ./nitter.conf
+COPY --from=nim /src/nitter/public ./public
-CMD ["./start.sh"]
+CMD ./nitter
README.md (57 lines changed)
@@ -3,8 +3,10 @@
 [![Test Matrix](https://github.com/zedeus/nitter/workflows/CI/CD/badge.svg)](https://github.com/zedeus/nitter/actions?query=workflow%3ACI/CD)
 [![License](https://img.shields.io/github/license/zedeus/nitter?style=flat)](#license)

-A free and open source alternative Twitter front-end focused on privacy. \
-Inspired by the [Invidious](https://github.com/iv-org/invidious) project.
+A free and open source alternative Twitter front-end focused on privacy and
+performance. \
+Inspired by the [Invidious](https://github.com/iv-org/invidious)
+project.

 - No JavaScript or ads
 - All requests go through the backend, client never talks to Twitter
@@ -39,12 +41,14 @@ maintained by the community.

 ## Why?

-It's basically impossible to use Twitter without JavaScript enabled. If you try,
-you're redirected to the legacy mobile version which is awful both functionally
-and aesthetically. For privacy-minded folks, preventing JavaScript analytics and
-potential IP-based tracking is important, but apart from using the legacy mobile
-version and a VPN, it's impossible. This is especially relevant now that Twitter
-[removed the ability](https://www.eff.org/deeplinks/2020/04/twitter-removes-privacy-option-and-shows-why-we-need-strong-privacy-laws)
+It's impossible to use Twitter without JavaScript enabled. For privacy-minded
+folks, preventing JavaScript analytics and IP-based tracking is important, but
+apart from using a VPN and uBlock/uMatrix, it's impossible. Despite being behind
+a VPN and using heavy-duty adblockers, you can get accurately tracked with your
+[browser's fingerprint](https://restoreprivacy.com/browser-fingerprinting/),
+[no JavaScript required](https://noscriptfingerprint.com/). This all became
+particularly important after Twitter [removed the
+ability](https://www.eff.org/deeplinks/2020/04/twitter-removes-privacy-option-and-shows-why-we-need-strong-privacy-laws)
 for users to control whether their data gets sent to advertisers.

 Using an instance of Nitter (hosted on a VPS for example), you can browse
@@ -62,6 +66,11 @@ Twitter account.

 ## Installation

+### Dependencies
+* libpcre
+* libsass
+* redis
+
 To compile Nitter you need a Nim installation, see
 [nim-lang.org](https://nim-lang.org/install.html) for details. It is possible to
 install it system-wide or in the user directory you create below.
@@ -84,7 +93,7 @@ $ git clone https://github.com/zedeus/nitter
 $ cd nitter
 $ nimble build -d:release
 $ nimble scss
 $ mkdir ./tmp
 $ cp nitter.example.conf nitter.conf
 ```

 Set your hostname, port, HMAC key, https (must be correct for cookies), and
@@ -92,21 +101,39 @@ Redis info in `nitter.conf`. To run Redis, either run
 `redis-server --daemonize yes`, or `systemctl enable --now redis` (or
 redis-server depending on the distro). Run Nitter by executing `./nitter` or
 using the systemd service below. You should run Nitter behind a reverse proxy
-such as [Nginx](https://github.com/zedeus/nitter/wiki/Nginx) or Apache for
-security reasons.
+such as [Nginx](https://github.com/zedeus/nitter/wiki/Nginx) or
+[Apache](https://github.com/zedeus/nitter/wiki/Apache) for security and
+performance reasons.

 ### Docker

+#### NOTE: For ARM64/ARM support, please use [unixfox's image](https://quay.io/repository/unixfox/nitter?tab=tags), more info [here](https://github.com/zedeus/nitter/issues/399#issuecomment-997263495)
+
+To run Nitter with Docker, you'll need to install and run Redis separately
+before you can run the container. See below for how to also run Redis using
+Docker.
+
 To build and run Nitter in Docker:
 ```bash
 docker build -t nitter:latest .
-docker run -v $(pwd)/nitter.conf:/src/nitter.conf -d -p 8080:8080 nitter:latest
+docker run -v $(pwd)/nitter.conf:/src/nitter.conf -d --network host nitter:latest
 ```

 A prebuilt Docker image is provided as well:
 ```bash
-docker run -v $(pwd)/nitter.conf:/src/nitter.conf -d -p 8080:8080 zedeus/nitter:latest
+docker run -v $(pwd)/nitter.conf:/src/nitter.conf -d --network host zedeus/nitter:latest
 ```

-Note the Docker commands expect a `nitter.conf` file in the directory you run them.
+Using docker-compose to run both Nitter and Redis as different containers:
+Change `redisHost` from `localhost` to `redis` in `nitter.conf`, then run:
+```bash
+docker-compose up -d
+```
+
+Note the Docker commands expect a `nitter.conf` file in the directory you run
+them.

 ### systemd

 To run Nitter via systemd you can use this service file:
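The docker-compose workflow above boils down to one config substitution before starting the stack. An illustrative Python sketch of that edit; the file written here is a hypothetical two-line stand-in, not the full `nitter.example.conf`:

```python
from pathlib import Path

# Stand-in config with the default Cache settings from nitter.example.conf.
conf = Path("nitter.conf")
conf.write_text('redisHost = "localhost"\nredisPort = 6379\n')

# Point redisHost at the compose service name, as the note above requires.
conf.write_text(conf.read_text().replace('redisHost = "localhost"',
                                         'redisHost = "redis"'))
print(conf.read_text().splitlines()[0])  # redisHost = "redis"
```

In practice this is a one-liner with `sed` before `docker-compose up -d`; the point is only that the hostname must match the service name in docker-compose.yml.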
@@ -137,6 +164,8 @@ WantedBy=multi-user.target
 Then enable and run the service:
 `systemctl enable --now nitter.service`

+### Logging
+
 Nitter currently prints some errors to stdout, and there is no real logging
 implemented. If you're running Nitter with systemd, you can check stdout like
 this: `journalctl -u nitter.service` (add `--follow` to see just the last 15
@@ -6,4 +6,11 @@
 # disable annoying warnings
 warning("GcUnsafe2", off)
+warning("ObservableStores", off)
 hint("XDeclaredButNotUsed", off)
 hint("User", off)
+
+const
+  nimVersion = (major: NimMajor, minor: NimMinor, patch: NimPatch)
+
+when nimVersion >= (1, 6, 0):
+  warning("HoleEnumConv", off)
@@ -0,0 +1,18 @@
+version: "3.8"
+services:
+  redis:
+    image: redis:6-alpine
+    restart: unless-stopped
+    volumes:
+      - redis-data:/var/lib/redis
+  nitter:
+    image: zedeus/nitter:latest
+    restart: unless-stopped
+    depends_on:
+      - redis
+    ports:
+      - "8080:8080"
+    volumes:
+      - ./nitter.conf:/src/nitter.conf
+volumes:
+  redis-data:
@@ -10,17 +10,22 @@ hostname = "nitter.net"
 [Cache]
 listMinutes = 240 # how long to cache list info (not the tweets, so keep it high)
 rssMinutes = 10 # how long to cache rss queries
-redisHost = "localhost"
+redisHost = "localhost" # Change to "redis" if using docker-compose
 redisPort = 6379
-redisConnections = 20 # connection pool size
+redisPassword = ""
+redisConnections = 20 # connection pool size
 redisMaxConnections = 30
 # max, new connections are opened when none are available, but if the pool size
 # goes above this, they're closed when released. don't worry about this unless
 # you receive tons of requests per second

 [Config]
-hmacKey = "secretkey" # random key for cryptographic signing of video urls
-base64Media = false # use base64 encoding for proxied media urls
+hmacKey = "secretkey" # random key for cryptographic signing of video urls
+base64Media = false # use base64 encoding for proxied media urls
+enableRSS = true # set this to false to disable RSS feeds
+enableDebug = false # enable request logs and debug endpoints
+proxy = "" # http/https url, SOCKS proxies are not supported
+proxyAuth = ""
 tokenCount = 10
 # minimum amount of usable tokens. tokens are used to authorize API requests,
 # but they expire after ~1 hour, and have a limit of 187 requests.

@@ -33,6 +38,7 @@ tokenCount = 10
 theme = "Nitter"
 replaceTwitter = "nitter.net"
 replaceYouTube = "piped.kavin.rocks"
+replaceReddit = "teddit.net"
 replaceInstagram = ""
 proxyVideos = true
 hlsPlayback = false
@@ -10,21 +10,22 @@ bin = @["nitter"]

 # Dependencies

-requires "nim >= 1.2.0"
+requires "nim >= 1.4.8"
 requires "jester >= 0.5.0"
-requires "karax >= 1.1.2"
+requires "karax#c71bc92"
 requires "sass#e683aa1"
-requires "regex#2e32fdc"
-requires "nimcrypto >= 0.4.11"
+requires "regex#eeefb4f"
+requires "nimcrypto#a5742a9"
 requires "markdown#abdbe5e"
-requires "packedjson#7198cc8"
-requires "supersnappy#1.1.5"
-requires "redpool#57aeb25"
-requires "https://github.com/zedeus/redis#94bcbf1"
-requires "https://github.com/disruptek/frosty#0.3.1"
+requires "packedjson#d11d167"
+requires "supersnappy#2.1.1"
+requires "redpool#f880f49"
+requires "https://github.com/zedeus/redis#d0a0e6f"
+requires "zippy#0.7.3"
+requires "flatty#0.2.3"


 # Tasks

 task scss, "Generate css":
-  exec "nim c --hint[Processing]:off -r tools/gencss"
+  exec "nim c --hint[Processing]:off -d:danger -r tools/gencss"
Binary file not shown (new image, 2.3 KiB).
@@ -1,4 +1,5 @@
 // @license http://www.gnu.org/licenses/agpl-3.0.html AGPL-3.0
+// SPDX-License-Identifier: AGPL-3.0-only
 function playVideo(overlay) {
   const video = overlay.parentElement.querySelector('video');
   const url = video.getAttribute("data-url");

@@ -16,7 +17,7 @@ function playVideo(overlay) {
     });
   } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
     video.src = url;
-    video.addEventListened('canplay', function() {
+    video.addEventListener('canplay', function() {
      video.play();
    });
  }
@@ -1,4 +1,5 @@
 // @license http://www.gnu.org/licenses/agpl-3.0.html AGPL-3.0
+// SPDX-License-Identifier: AGPL-3.0-only
 function insertBeforeLast(node, elem) {
   node.insertBefore(elem, node.childNodes[node.childNodes.length - 2]);
 }
@@ -1,7 +1,8 @@
 # About

 Nitter is a free and open source alternative Twitter front-end focused on
-privacy. The source is available on GitHub at <https://github.com/zedeus/nitter>
+privacy and performance. The source is available on GitHub at
+<https://github.com/zedeus/nitter>

 * No JavaScript or ads
 * All requests go through the backend, client never talks to Twitter

@@ -20,16 +21,20 @@ maintained by the community.

 ## Why use Nitter?

-It's basically impossible to use Twitter without JavaScript enabled. If you try,
-you're redirected to the legacy mobile version which is awful both functionally
-and aesthetically. For privacy-minded folks, preventing JavaScript analytics and
-potential IP-based tracking is important, but apart from using the legacy mobile
-version and a VPN, it's impossible.
+It's impossible to use Twitter without JavaScript enabled. For privacy-minded
+folks, preventing JavaScript analytics and IP-based tracking is important, but
+apart from using a VPN and uBlock/uMatrix, it's impossible. Despite being behind
+a VPN and using heavy-duty adblockers, you can get accurately tracked with your
+[browser's fingerprint](https://restoreprivacy.com/browser-fingerprinting/),
+[no JavaScript required](https://noscriptfingerprint.com/). This all became
+particularly important after Twitter [removed the
+ability](https://www.eff.org/deeplinks/2020/04/twitter-removes-privacy-option-and-shows-why-we-need-strong-privacy-laws)
+for users to control whether their data gets sent to advertisers.

 Using an instance of Nitter (hosted on a VPS for example), you can browse
 Twitter without JavaScript while retaining your privacy. In addition to
 respecting your privacy, Nitter is on average around 15 times lighter than
-Twitter, and in some cases serves pages faster.
+Twitter, and in most cases serves pages faster (eg. timelines load 2-4x faster).

 In the future a simple account system will be added that lets you follow Twitter
 users, allowing you to have a clean chronological timeline without needing a

@@ -46,5 +51,4 @@ XMR: 42hKayRoEAw4D6G6t8mQHPJHQcXqofjFuVfavqKeNMNUZfeJLJAcNU19i1bGdDvcdN6romiSscW

 ## Contact

-Feel free to join our Freenode IRC channel at #nitter, or our
-[Matrix channel](https://matrix.to/#/#nitter:matrix.org).
+Feel free to join our [Matrix channel](https://matrix.to/#/#nitter:matrix.org).
@@ -11,6 +11,11 @@
       "src": "/android-chrome-384x384.png",
       "sizes": "384x384",
       "type": "image/png"
+    },
+    {
+      "src": "/android-chrome-512x512.png",
+      "sizes": "512x512",
+      "type": "image/png"
     }
   ],
   "theme_color": "#333333",
screenshot.png: binary file not shown (912 KiB before, 957 KiB after).
@@ -1,3 +1,4 @@
+# SPDX-License-Identifier: AGPL-3.0-only
 import random, strformat, strutils, sequtils

 randomize()

@@ -30,7 +31,7 @@ proc windows(): string =
     trident = ["", "; Trident/5.0", "; Trident/6.0", "; Trident/7.0"]
   "Windows " & sample(nt) & sample(enc) & sample(arch) & sample(trident)

-let macs = toSeq(6..15).mapIt($it) & @["14_4", "10_1", "9_3"]
+const macs = toSeq(6..15).mapIt($it) & @["14_4", "10_1", "9_3"]

 proc mac(): string =
   "Macintosh; Intel Mac OS X 10_" & sample(macs) & sample(enc)
src/api.nim (61 lines changed)
@@ -1,60 +1,62 @@
 # SPDX-License-Identifier: AGPL-3.0-only
 import asyncdispatch, httpclient, uri, strutils
 import packedjson
 import types, query, formatters, consts, apiutils, parser

 proc getGraphProfile*(username: string): Future[Profile] {.async.} =
   let
     variables = %*{"screen_name": username, "withHighlightedLabel": true}
     js = await fetch(graphUser ? {"variables": $variables})
   result = parseGraphProfile(js, username)

-proc getGraphList*(name, list: string): Future[List] {.async.} =
+proc getGraphListBySlug*(name, list: string): Future[List] {.async.} =
   let
     variables = %*{"screenName": name, "listSlug": list, "withHighlightedLabel": false}
-    js = await fetch(graphList ? {"variables": $variables})
-  result = parseGraphList(js)
+    url = graphListBySlug ? {"variables": $variables}
+  result = parseGraphList(await fetch(url, Api.listBySlug))

-proc getGraphListById*(id: string): Future[List] {.async.} =
+proc getGraphList*(id: string): Future[List] {.async.} =
   let
     variables = %*{"listId": id, "withHighlightedLabel": false}
-    js = await fetch(graphListId ? {"variables": $variables})
-  result = parseGraphList(js)
+    url = graphList ? {"variables": $variables}
+  result = parseGraphList(await fetch(url, Api.list))

 proc getListTimeline*(id: string; after=""): Future[Timeline] {.async.} =
   if id.len == 0: return
   let
     ps = genParams({"list_id": id, "ranking_mode": "reverse_chronological"}, after)
     url = listTimeline ? ps
-  result = parseTimeline(await fetch(url), after)
+  result = parseTimeline(await fetch(url, Api.timeline), after)

 proc getListMembers*(list: List; after=""): Future[Result[Profile]] {.async.} =
   if list.id.len == 0: return
   let
     ps = genParams({"list_id": list.id}, after)
     url = listMembers ? ps
-  result = parseListMembers(await fetch(url, oldApi=true), after)
+  result = parseListMembers(await fetch(url, Api.listMembers), after)

 proc getProfile*(username: string): Future[Profile] {.async.} =
   let
     ps = genParams({"screen_name": username})
-    url = userShow ? ps
-  result = parseUserShow(await fetch(url, oldApi=true), username)
+    js = await fetch(userShow ? ps, Api.userShow)
+  result = parseUserShow(js, username=username)

+proc getProfileById*(userId: string): Future[Profile] {.async.} =
+  let
+    ps = genParams({"user_id": userId})
+    js = await fetch(userShow ? ps, Api.userShow)
+  result = parseUserShow(js, id=userId)
+
 proc getTimeline*(id: string; after=""; replies=false): Future[Timeline] {.async.} =
   let
     ps = genParams({"userId": id, "include_tweet_replies": $replies}, after)
     url = timeline / (id & ".json") ? ps
-  result = parseTimeline(await fetch(url), after)
+  result = parseTimeline(await fetch(url, Api.timeline), after)

 proc getMediaTimeline*(id: string; after=""): Future[Timeline] {.async.} =
   let url = mediaTimeline / (id & ".json") ? genParams(cursor=after)
-  result = parseTimeline(await fetch(url), after)
+  result = parseTimeline(await fetch(url, Api.timeline), after)

 proc getPhotoRail*(name: string): Future[PhotoRail] {.async.} =
   let
     ps = genParams({"screen_name": name, "trim_user": "true"},
                    count="18", ext=false)
     url = photoRail ? ps
-  result = parsePhotoRail(await fetch(url, oldApi=true))
+  result = parsePhotoRail(await fetch(url, Api.photoRail))

 proc getSearch*[T](query: Query; after=""): Future[Result[T]] {.async.} =
   when T is Profile:

@@ -66,15 +68,20 @@ proc getSearch*[T](query: Query; after=""): Future[Result[T]] {.async.} =
     searchMode = ("tweet_search_mode", "live")
     parse = parseTimeline

-  let
-    q = genQueryParam(query)
-    url = search ? genParams(searchParams & @[("q", q), searchMode], after)
-  result = parse(await fetch(url), after)
-  result.query = query
+  let q = genQueryParam(query)
+  if q.len == 0 or q == emptyQuery:
+    return Result[T](beginning: true, query: query)
+
+  let url = search ? genParams(searchParams & @[("q", q), searchMode], after)
+  try:
+    result = parse(await fetch(url, Api.search), after)
+    result.query = query
+  except InternalError:
+    return Result[T](beginning: true, query: query)

 proc getTweetImpl(id: string; after=""): Future[Conversation] {.async.} =
   let url = tweet / (id & ".json") ? genParams(cursor=after)
-  result = parseConversation(await fetch(url), id)
+  result = parseConversation(await fetch(url, Api.tweet), id)

 proc getReplies*(id, after: string): Future[Result[Chain]] {.async.} =
   result = (await getTweetImpl(id, after)).replies

@@ -88,8 +95,8 @@ proc getTweet*(id: string; after=""): Future[Conversation] {.async.} =
 proc resolve*(url: string; prefs: Prefs): Future[string] {.async.} =
   let client = newAsyncHttpClient(maxRedirects=0)
   try:
-    let resp = await client.request(url, $HttpHead)
-    result = resp.headers["location"].replaceUrl(prefs)
+    let resp = await client.request(url, HttpHead)
+    result = resp.headers["location"].replaceUrls(prefs)
   except:
     discard
   finally:
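The `getSearch` change above adds two early exits: empty queries no longer hit the network, and an `InternalError` from `fetch` degrades to an empty result. A hedged Python sketch of that control flow; `EMPTY_QUERY`, the dict shape, and `RuntimeError` are stand-ins for Nim's `emptyQuery`, `Result[T]`, and `InternalError`:

```python
EMPTY_QUERY = ""  # stand-in for the emptyQuery constant in query.nim

def search_or_empty(q, fetch):
    # Mirror of the guard added to getSearch: short-circuit on empty
    # queries, and turn fetch failures into an empty "beginning" result.
    empty = {"beginning": True, "query": q, "content": []}
    if not q or q == EMPTY_QUERY:
        return empty                      # no network call at all
    try:
        res = fetch(q)
        res["query"] = q
        return res
    except RuntimeError:                  # stand-in for InternalError
        return empty

print(search_or_empty("", None))          # empty result, fetch never called
print(search_or_empty("nim", lambda q: {"content": [q]}))
```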
@@ -1,22 +1,30 @@
 # SPDX-License-Identifier: AGPL-3.0-only
 import httpclient, asyncdispatch, options, times, strutils, uri
-import packedjson
+import packedjson, zippy
 import types, tokens, consts, parserutils, http_pool

-const rl = "x-rate-limit-"
+const
+  rlRemaining = "x-rate-limit-remaining"
+  rlReset = "x-rate-limit-reset"

-var pool {.threadvar.}: HttpPool
+var pool: HttpPool

-proc genParams*(pars: openarray[(string, string)] = @[]; cursor="";
+proc genParams*(pars: openArray[(string, string)] = @[]; cursor="";
                 count="20"; ext=true): seq[(string, string)] =
   result = timelineParams
   for p in pars:
     result &= p
   if ext:
     result &= ("ext", "mediaStats")
-  if cursor.len > 0:
-    result &= ("cursor", cursor)
   if count.len > 0:
     result &= ("count", count)
+  if cursor.len > 0:
+    # The raw cursor often has plus signs, which sometimes get turned into
+    # spaces, so we need to turn them back into a plus
+    if " " in cursor:
+      result &= ("cursor", cursor.replace(" ", "+"))
+    else:
+      result &= ("cursor", cursor)

 proc genHeaders*(token: Token = nil): HttpHeaders =
   result = newHttpHeaders({

@@ -26,46 +34,58 @@ proc genHeaders*(token: Token = nil): HttpHeaders =
     "x-guest-token": if token == nil: "" else: token.tok,
     "x-twitter-active-user": "yes",
     "authority": "api.twitter.com",
+    "accept-encoding": "gzip",
     "accept-language": "en-US,en;q=0.9",
     "accept": "*/*",
     "DNT": "1"
   })

-proc fetch*(url: Uri; oldApi=false): Future[JsonNode] {.async.} =
+proc fetch*(url: Uri; api: Api): Future[JsonNode] {.async.} =
   once:
     pool = HttpPool()

-  var token = await getToken()
+  var token = await getToken(api)
   if token.tok.len == 0:
     raise rateLimitError()

   let headers = genHeaders(token)
   try:
     var resp: AsyncResponse
-    let body = pool.use(headers):
+    var body = pool.use(headers):
       resp = await c.get($url)
       await resp.body

+    if body.len > 0:
+      if resp.headers.getOrDefault("content-encoding") == "gzip":
+        body = uncompress(body, dfGzip)
+      else:
+        echo "non-gzip body, url: ", url, ", body: ", body
+
     if body.startsWith('{') or body.startsWith('['):
       result = parseJson(body)
     else:
       echo resp.status, ": ", body
       result = newJNull()

-    if not oldApi and resp.headers.hasKey(rl & "reset"):
-      let time = fromUnix(parseInt(resp.headers[rl & "reset"]))
-      if token.reset != time:
-        token.remaining = parseInt(resp.headers[rl & "limit"])
-        token.reset = time
+    if api != Api.search and resp.headers.hasKey(rlRemaining):
+      let
+        remaining = parseInt(resp.headers[rlRemaining])
+        reset = parseInt(resp.headers[rlReset])
+      token.setRateLimit(api, remaining, reset)

     if result.getError notin {invalidToken, forbidden, badToken}:
       token.lastUse = getTime()
+      release(token, used=true)
     else:
       echo "fetch error: ", result.getError
-      release(token, true)
+      release(token, invalid=true)
       raise rateLimitError()
+
+    if resp.status == $Http400:
+      raise newException(InternalError, $url)
+  except InternalError as e:
+    raise e
   except Exception as e:
-    echo "error: ", e.msg, ", token: ", token[], ", url: ", url
+    echo "error: ", e.name, ", msg: ", e.msg, ", token: ", token[], ", url: ", url
     if "length" notin e.msg and "descriptor" notin e.msg:
-      release(token, true)
+      release(token, invalid=true)
     raise rateLimitError()
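The new `fetch` branch requests gzip (`accept-encoding: gzip` in `genHeaders`) and decompresses only when the response's `content-encoding` header says so; in Nim this is zippy's `uncompress(body, dfGzip)`. A minimal Python sketch of the same decision, using the stdlib `gzip` module in place of zippy:

```python
import gzip

def decode_body(raw: bytes, content_encoding: str) -> str:
    # Un-gzip only when the server actually gzipped the body,
    # mirroring the content-encoding check added to fetch().
    if raw and content_encoding == "gzip":
        raw = gzip.decompress(raw)
    return raw.decode()

payload = gzip.compress(b'{"ok": true}')
print(decode_body(payload, "gzip"))  # {"ok": true}
print(decode_body(b'plain', ""))     # plain
```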
@@ -1,8 +1,9 @@
 # SPDX-License-Identifier: AGPL-3.0-only
 import parsecfg except Config
 import types, strutils

-proc get*[T](config: parseCfg.Config; s, v: string; default: T): T =
-  let val = config.getSectionValue(s, v)
+proc get*[T](config: parseCfg.Config; section, key: string; default: T): T =
+  let val = config.getSectionValue(section, key)
   if val.len == 0: return default

   when T is int: parseInt(val)

@@ -13,19 +14,16 @@ proc getConfig*(path: string): (Config, parseCfg.Config) =
   var cfg = loadConfig(path)

   let conf = Config(
     # Server
     address: cfg.get("Server", "address", "0.0.0.0"),
     port: cfg.get("Server", "port", 8080),
     useHttps: cfg.get("Server", "https", true),
     httpMaxConns: cfg.get("Server", "httpMaxConnections", 100),
-
-    staticDir: cfg.get("Server", "staticDir", "./public"),
     title: cfg.get("Server", "title", "Nitter"),
     hostname: cfg.get("Server", "hostname", "nitter.net"),
+    staticDir: cfg.get("Server", "staticDir", "./public"),

-    hmacKey: cfg.get("Config", "hmacKey", "secretkey"),
-    base64Media: cfg.get("Config", "base64Media", false),
-    minTokens: cfg.get("Config", "tokenCount", 10),
-
     # Cache
     listCacheTime: cfg.get("Cache", "listMinutes", 120),
     rssCacheTime: cfg.get("Cache", "rssMinutes", 10),

@@ -33,8 +31,16 @@ proc getConfig*(path: string): (Config, parseCfg.Config) =
     redisPort: cfg.get("Cache", "redisPort", 6379),
     redisConns: cfg.get("Cache", "redisConnections", 20),
     redisMaxConns: cfg.get("Cache", "redisMaxConnections", 30),
+    redisPassword: cfg.get("Cache", "redisPassword", ""),

-    replaceYouTube: cfg.get("Preferences", "replaceYouTube", "piped.kavin.rocks")
+    # Config
+    hmacKey: cfg.get("Config", "hmacKey", "secretkey"),
+    base64Media: cfg.get("Config", "base64Media", false),
+    minTokens: cfg.get("Config", "tokenCount", 10),
+    enableRss: cfg.get("Config", "enableRSS", true),
+    enableDebug: cfg.get("Config", "enableDebug", false),
+    proxy: cfg.get("Config", "proxy", ""),
+    proxyAuth: cfg.get("Config", "proxyAuth", "")
   )

   return (conf, cfg)
@@ -1,3 +1,4 @@
+# SPDX-License-Identifier: AGPL-3.0-only
 import uri, sequtils

 const

@@ -12,15 +13,14 @@ const
   search* = api / "2/search/adaptive.json"

   timelineApi = api / "2/timeline"
-  tweet* = timelineApi / "conversation"
   timeline* = timelineApi / "profile"
   mediaTimeline* = timelineApi / "media"
   listTimeline* = timelineApi / "list.json"
+  tweet* = timelineApi / "conversation"

   graphql = api / "graphql"
   graphUser* = graphql / "E4iSsd6gypGFWx2eUhSC1g/UserByScreenName"
-  graphList* = graphql / "ErWsz9cObLel1BF-HjuBlA/ListBySlug"
-  graphListId* = graphql / "JADTh6cjebfgetzvF3tQvQ/List"
+  graphListBySlug* = graphql / "ErWsz9cObLel1BF-HjuBlA/ListBySlug"
+  graphList* = graphql / "JADTh6cjebfgetzvF3tQvQ/List"

   timelineParams* = {
     "include_profile_interstitial_type": "0",
@@ -1,25 +1,36 @@
-import strutils, strformat, times, uri, tables, xmltree, htmlparser
+# SPDX-License-Identifier: AGPL-3.0-only
+import strutils, strformat, times, uri, tables, xmltree, htmlparser, htmlgen
 import regex
 import types, utils, query

 const
   ytRegex = re"([A-z.]+\.)?youtu(be\.com|\.be)"
-  twRegex = re"(www\.|mobile\.)?twitter\.com"
-  igRegex = re"(www\.)?instagram.com"
+  igRegex = re"(www\.)?instagram\.com"
+
+  rdRegex = re"(?<![.b])((www|np|new|amp|old)\.)?reddit.com"
+  rdShortRegex = re"(?<![.b])redd\.it\/"
+  # Videos cannot be supported uniformly between Teddit and Libreddit,
+  # so v.redd.it links will not be replaced.
+  # Images aren't supported due to errors from Teddit when the image
+  # wasn't first displayed via a post on the Teddit instance.
+
+  twRegex = re"(?<=(?<!\S)https:\/\/|(?<=\s))(www\.|mobile\.)?twitter\.com"
+  twLinkRegex = re"""<a href="https:\/\/twitter.com([^"]+)">twitter\.com(\S+)</a>"""

   cards = "cards.twitter.com/cards"
   tco = "https://t.co"

   wwwRegex = re"https?://(www[0-9]?\.)?"
   m3u8Regex = re"""url="(.+.m3u8)""""
   manifestRegex = re"\/(.+(.ts|.m4s|.m3u8|.vmap|.mp4))"
-  userpicRegex = re"_(normal|bigger|mini|200x200|400x400)(\.[A-z]+)$"
+  userPicRegex = re"_(normal|bigger|mini|200x200|400x400)(\.[A-z]+)$"
   extRegex = re"(\.[A-z]+)$"
   illegalXmlRegex = re"[^\x09\x0A\x0D\x20-\uD7FF\uE000-\uFFFD\u10000-\u10FFFF]"

   twitter = parseUri("https://twitter.com")

 proc getUrlPrefix*(cfg: Config): string =
-  if cfg.useHttps: "https://" & cfg.hostname
+  if cfg.useHttps: https & cfg.hostname
   else: "http://" & cfg.hostname

 proc stripHtml*(text: string): string =
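The lookbehind `(?<![.b])` in `rdShortRegex` is what keeps `v.redd.it` links intact, as the comment above explains. An illustrative Python equivalent of just that substitution; `teddit.net` stands in for a configured `replaceReddit` instance, and this is a sketch of the regex's behavior, not the full `replaceUrls` logic:

```python
import re

# Same pattern as the Nim rdShortRegex: "redd.it/" not preceded by '.' or 'b'.
rd_short = re.compile(r"(?<![.b])redd\.it/")

def replace_reddit_short(body: str, instance: str = "teddit.net") -> str:
    # Short links become /comments/ links on the replacement instance.
    return rd_short.sub(instance + "/comments/", body)

print(replace_reddit_short("https://redd.it/abc123"))  # rewritten
print(replace_reddit_short("https://v.redd.it/xyz"))   # left alone: '.' before redd.it
```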
@@ -39,19 +50,32 @@ proc shortLink*(text: string; length=28): string =
   if result.len > length:
     result = result[0 ..< length] & "…"

-proc replaceUrl*(url: string; prefs: Prefs; absolute=""): string =
-  result = url
-  if prefs.replaceYouTube.len > 0:
+proc replaceUrls*(body: string; prefs: Prefs; absolute=""): string =
+  result = body
+
+  if prefs.replaceYouTube.len > 0 and ytRegex in result:
     result = result.replace(ytRegex, prefs.replaceYouTube)
-  if prefs.replaceInstagram.len > 0:
-    result = result.replace(igRegex, prefs.replaceInstagram)
-  if prefs.replaceTwitter.len > 0:
-    result = result.replace(tco, "https://" & prefs.replaceTwitter & "/t.co")
+    if prefs.replaceYouTube in result:
+      result = result.replace("/c/", "/")
+
+  if prefs.replaceTwitter.len > 0 and
+     (twRegex in result or twLinkRegex in result or tco in result):
+    result = result.replace(tco, https & prefs.replaceTwitter & "/t.co")
     result = result.replace(cards, prefs.replaceTwitter & "/cards")
     result = result.replace(twRegex, prefs.replaceTwitter)
+    if absolute.len > 0:
+      result = result.replace(twLinkRegex, a(
+        prefs.replaceTwitter & "$2", href = https & prefs.replaceTwitter & "$1"))
+
+  if prefs.replaceReddit.len > 0 and (rdRegex in result or "redd.it" in result):
+    result = result.replace(rdShortRegex, prefs.replaceReddit & "/comments/")
+    result = result.replace(rdRegex, prefs.replaceReddit)
+    if prefs.replaceReddit in result and "/gallery/" in result:
+      result = result.replace("/gallery/", "/comments/")
+
+  if prefs.replaceInstagram.len > 0 and igRegex in result:
+    result = result.replace(igRegex, prefs.replaceInstagram)

   if absolute.len > 0 and "href" in result:
     result = result.replace("href=\"/", "href=\"" & absolute & "/")

 proc getM3u8Url*(content: string): string =
@@ -65,12 +89,12 @@ proc proxifyVideo*(manifest: string; proxy: bool): string =
     if proxy: result = getVidUrl(result)
   result = manifest.replace(manifestRegex, cb)

-proc getUserpic*(userpic: string; style=""): string =
-  let pic = userpic.replace(userpicRegex, "$2")
+proc getUserPic*(userPic: string; style=""): string =
+  let pic = userPic.replace(userPicRegex, "$2")
   pic.replace(extRegex, style & "$1")

-proc getUserpic*(profile: Profile; style=""): string =
-  getUserPic(profile.userpic, style)
+proc getUserPic*(profile: Profile; style=""): string =
+  getUserPic(profile.userPic, style)

 proc getVideoEmbed*(cfg: Config; id: int64): string =
   &"{getUrlPrefix(cfg)}/i/videos/{id}"
@ -94,22 +118,16 @@ proc getJoinDateFull*(profile: Profile): string =
|
|||
profile.joinDate.format("h:mm tt - d MMM YYYY")
|
||||
|
||||
proc getTime*(tweet: Tweet): string =
|
||||
tweet.time.format("d/M/yyyy', 'HH:mm:ss")
|
||||
tweet.time.format("MMM d', 'YYYY' · 'h:mm tt' UTC'")
|
||||
|
||||
proc getRfc822Time*(tweet: Tweet): string =
|
||||
tweet.time.format("ddd', 'dd MMM yyyy HH:mm:ss 'GMT'")
|
||||
|
||||
proc getTweetTime*(tweet: Tweet): string =
|
||||
tweet.time.format("h:mm tt' · 'MMM d', 'YYYY")
|
||||
|
||||
proc getShortTime*(tweet: Tweet): string =
|
||||
let now = now()
|
||||
var then = tweet.time.local()
|
||||
then.utcOffset = 0
|
||||
let since = now - tweet.time
|
||||
|
||||
let since = now - then
|
||||
|
||||
if now.year != then.year:
|
||||
if now.year != tweet.time.year:
|
||||
result = tweet.time.format("d MMM yyyy")
|
||||
elif since.inDays >= 1:
|
||||
result = tweet.time.format("MMM d")
|
||||
|
@ -141,7 +159,7 @@ proc getTwitterLink*(path: string; params: Table[string, string]): string =
|
|||
path = "/search"
|
||||
|
||||
if "/search" notin path and query.fromUser.len < 2:
|
||||
return $(twitter / path ? filterParams(params))
|
||||
return $(twitter / path)
|
||||
|
||||
let p = {
|
||||
"f": if query.kind == users: "user" else: "live",
|
||||
|
|
|
@@ -1,18 +1,22 @@
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, httpclient

type
HttpPool* = ref object
conns*: seq[AsyncHttpClient]

var maxConns {.threadvar.}: int

let keepAlive* = newHttpHeaders({
"Connection": "Keep-Alive"
})
var maxConns: int
var proxy: Proxy

proc setMaxHttpConns*(n: int) =
maxConns = n

proc setHttpProxy*(url: string; auth: string) =
if url.len > 0:
proxy = newProxy(url, auth)
else:
proxy = nil

proc release*(pool: HttpPool; client: AsyncHttpClient) =
if pool.conns.len >= maxConns:
client.close()

@@ -23,7 +27,7 @@ template use*(pool: HttpPool; heads: HttpHeaders; body: untyped): untyped =
var c {.inject.}: AsyncHttpClient

if pool.conns.len == 0:
c = newAsyncHttpClient(headers=heads)
c = newAsyncHttpClient(headers=heads, proxy=proxy)
else:
c = pool.conns.pop()
c.headers = heads
@@ -1,4 +1,5 @@
import asyncdispatch, strformat
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, strformat, logging
from net import Port
from htmlgen import a
from os import getEnv

@@ -8,16 +9,16 @@ import jester
import types, config, prefs, formatters, redis_cache, http_pool, tokens
import views/[general, about]
import routes/[
preferences, timeline, status, media, search, rss, list,
preferences, timeline, status, media, search, rss, list, debug,
unsupported, embed, resolver, router_utils]

const instancesUrl = "https://github.com/zedeus/nitter/wiki/Instances"
const issuesUrl = "https://github.com/zedeus/nitter/issues"

let configPath = getEnv("NITTER_CONF_FILE", "./nitter.conf")
let (cfg, fullCfg) = getConfig(configPath)

when defined(release):
import logging
if not cfg.enableDebug:
# Silence Jester's query warning
addHandler(newConsoleLogger())
setLogFilter(lvlError)

@@ -30,6 +31,7 @@ setCacheTimes(cfg)
setHmacKey(cfg.hmacKey)
setProxyEncoding(cfg.base64Media)
setMaxHttpConns(cfg.httpMaxConns)
setHttpProxy(cfg.proxy, cfg.proxyAuth)

waitFor initRedisPool(cfg)
stdout.write &"Connected to Redis at {cfg.redisHost}:{cfg.redisPort}\n"

@@ -47,6 +49,7 @@ createSearchRouter(cfg)
createMediaRouter(cfg)
createEmbedRouter(cfg)
createRssRouter(cfg)
createDebugRouter(cfg)

settings:
port = Port(cfg.port)

@@ -69,16 +72,22 @@ routes:
get "/i/redirect":
let url = decodeUrl(@"url")
if url.len == 0: resp Http404
redirect(replaceUrl(url, cookiePrefs()))
redirect(replaceUrls(url, cookiePrefs()))

error Http404:
resp Http404, showError("Page not found", cfg)

error InternalError:
echo error.exc.name, ": ", error.exc.msg
const link = a("open a GitHub issue", href = issuesUrl)
resp Http500, showError(
&"An error occurred, please {link} with the URL you tried to visit.", cfg)

error RateLimitError:
echo error.exc.msg
resp Http429, showError("Instance has been rate limited.<br>Use " &
a("another instance", href = instancesUrl) &
" or try again later.", cfg)
echo error.exc.name, ": ", error.exc.msg
const link = a("another instance", href = instancesUrl)
resp Http429, showError(
&"Instance has been rate limited.<br>Use {link} or try again later.", cfg)

extend unsupported, ""
extend preferences, ""

@@ -90,3 +99,4 @@ routes:
extend status, ""
extend media, ""
extend embed, ""
extend debug, ""
@@ -1,5 +1,7 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strutils, options, tables, times, math
import packedjson
import packedjson / deserialiser
import types, parserutils, utils

proc parseProfile(js: JsonNode; id=""): Profile =

@@ -10,7 +12,7 @@ proc parseProfile(js: JsonNode; id=""): Profile =
fullname: js{"name"}.getStr,
location: js{"location"}.getStr,
bio: js{"description"}.getStr,
userpic: js{"profile_image_url_https"}.getImageStr.replace("_normal", ""),
userPic: js{"profile_image_url_https"}.getImageStr.replace("_normal", ""),
banner: js.getBanner,
following: $js{"friends_count"}.getInt,
followers: $js{"followers_count"}.getInt,

@@ -24,30 +26,21 @@ proc parseProfile(js: JsonNode; id=""): Profile =
result.expandProfileEntities(js)

proc parseUserShow*(js: JsonNode; username: string): Profile =
if js.isNull:
return Profile(username: username)
proc parseUserShow*(js: JsonNode; username=""; id=""): Profile =
if id.len > 0:
result = Profile(id: id)
else:
result = Profile(username: username)

if js.isNull: return

with error, js{"errors"}:
result = Profile(username: username)
if error.getError == suspended:
result.suspended = true
return

result = parseProfile(js)

proc parseGraphProfile*(js: JsonNode; username: string): Profile =
if js.isNull: return
with error, js{"errors"}:
result = Profile(username: username)
if error.getError == suspended:
result.suspended = true
return

let user = js{"data", "user", "legacy"}
let id = js{"data", "user", "rest_id"}.getStr
result = parseProfile(user, id)

proc parseGraphList*(js: JsonNode): List =
if js.isNull: return

@@ -59,9 +52,9 @@ proc parseGraphList*(js: JsonNode): List =
result = List(
id: list{"id_str"}.getStr,
name: list{"name"}.getStr.replace(' ', '-'),
name: list{"name"}.getStr,
username: list{"user", "legacy", "screen_name"}.getStr,
userId: list{"user", "legacy", "id_str"}.getStr,
userId: list{"user", "rest_id"}.getStr,
description: list{"description"}.getStr,
members: list{"member_count"}.getInt,
banner: list{"custom_banner_media", "media_info", "url"}.getImageStr

@@ -92,8 +85,8 @@ proc parsePoll(js: JsonNode): Poll =
result.options.add vals{choice & "_label"}.getStrVal

let time = vals{"end_datetime_utc", "string_value"}.getDateTime
if time > getTime():
let timeLeft = $(time - getTime())
if time > now():
let timeLeft = $(time - now())
result.status = timeLeft[0 ..< timeLeft.find(",")]
else:
result.status = "Final results"

@@ -114,13 +107,16 @@ proc parseVideo(js: JsonNode): Video =
views: js{"ext", "mediaStats", "r", "ok", "viewCount"}.getStr,
available: js{"ext_media_availability", "status"}.getStr == "available",
title: js{"ext_alt_text"}.getStr,
durationMs: js{"duration_millis"}.getInt
durationMs: js{"video_info", "duration_millis"}.getInt
# playbackType: mp4
)

with title, js{"additional_media_info", "title"}:
result.title = title.getStr

with description, js{"additional_media_info", "description"}:
result.description = description.getStr

for v in js{"video_info", "variants"}:
result.variants.add VideoVariant(
videoType: parseEnum[VideoType](v{"content_type"}.getStr("summary")),

@@ -168,7 +164,7 @@ proc parseCard(js: JsonNode; urls: JsonNode): Card =
let
vals = ? js{"binding_values"}
name = js{"name"}.getStr
kind = parseEnum[CardKind](name[(name.find(":") + 1) ..< name.len])
kind = parseEnum[CardKind](name[(name.find(":") + 1) ..< name.len], unknown)

result = Card(
kind: kind,

@@ -194,7 +190,7 @@ proc parseCard(js: JsonNode; urls: JsonNode): Card =
result.url = vals{"player_url"}.getStrVal
if "youtube.com" in result.url:
result.url = result.url.replace("/embed/", "/watch?v=")
of unified:
of audiospace, unified, unknown:
result.title = "This card type is not supported."
else: discard

@@ -268,6 +264,20 @@ proc parseTweet(js: JsonNode): Tweet =
result.gif = some(parseGif(m))
else: discard

with jsWithheld, js{"withheld_in_countries"}:
let withheldInCountries: seq[string] =
if jsWithheld.kind != JArray: @[]
else: jsWithheld.to(seq[string])

# XX - Content is withheld in all countries
# XY - Content is withheld due to a DMCA request.
if js{"withheld_copyright"}.getBool or
withheldInCountries.len > 0 and ("XX" in withheldInCountries or
"XY" in withheldInCountries or
"withheld" in result.text):
result.text.removeSuffix(" Learn more.")
result.available = false

proc finalizeTweet(global: GlobalObjects; id: string): Tweet =
let intId = if id.len > 0: parseBiggestInt(id) else: 0
result = global.tweets.getOrDefault(id, Tweet(id: intId))

@@ -389,7 +399,7 @@ proc parseUsers*(js: JsonNode; after=""): Result[Profile] =
for e in instructions[0]{"addEntries", "entries"}:
let entry = e{"entryId"}.getStr
if "sq-I-u" in entry:
if "user-" in entry:
let id = entry.getId
if id in global.users:
result.content.add global.users[id]
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strutils, times, macros, htmlgen, unicode, options, algorithm
import regex, packedjson
import types, utils, formatters

@@ -43,14 +44,14 @@ template getError*(js: JsonNode): Error =
if js.kind != JArray or js.len == 0: null
else: Error(js[0]{"code"}.getInt)

template parseTime(time: string; f: static string; flen: int): Time =
template parseTime(time: string; f: static string; flen: int): DateTime =
if time.len != flen: return
parse(time, f).toTime
parse(time, f, utc())

proc getDateTime*(js: JsonNode): Time =
proc getDateTime*(js: JsonNode): DateTime =
parseTime(js.getStr, "yyyy-MM-dd\'T\'HH:mm:ss\'Z\'", 20)

proc getTime*(js: JsonNode): Time =
proc getTime*(js: JsonNode): DateTime =
parseTime(js.getStr, "ddd MMM dd hh:mm:ss \'+0000\' yyyy", 30)

proc getId*(id: string): string {.inline.} =

@@ -117,7 +118,7 @@ proc getBanner*(js: JsonNode): string =
if color.len > 0:
return '#' & color

# use primary color from profile picture color histrogram
# use primary color from profile picture color histogram
with p, js{"profile_image_extensions", "mediaColor", "r", "ok", "palette"}:
if p.len > 0:
let pal = p[0]{"rgb"}
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import tables
import types, prefs_impl
from config import get

@@ -5,7 +6,7 @@ from parsecfg import nil

export genUpdatePrefs, genResetPrefs

var defaultPrefs* {.threadvar.}: Prefs
var defaultPrefs*: Prefs

proc updateDefaultPrefs*(cfg: parsecfg.Config) =
genDefaultPrefs()
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import macros, tables, strutils, xmltree

type

@@ -49,41 +50,12 @@ macro genPrefs*(prefDsl: untyped) =
const `name`*: PrefList = toOrderedTable(`table`)

genPrefs:
Privacy:
replaceTwitter(input, "nitter.net"):
"Replace Twitter links with Nitter (blank to disable)"
placeholder: "Nitter hostname"

replaceYouTube(input, "piped.kavin.rocks"):
"Replace YouTube links with Piped/Invidious (blank to disable)"
placeholder: "Piped hostname"

replaceInstagram(input, ""):
"Replace Instagram links with Bibliogram (blank to disable)"
placeholder: "Bibliogram hostname"

Media:
mp4Playback(checkbox, true):
"Enable mp4 video playback"

hlsPlayback(checkbox, false):
"Enable hls video streaming (requires JavaScript)"

proxyVideos(checkbox, true):
"Proxy video streaming through the server (might be slow)"

muteVideos(checkbox, false):
"Mute videos by default"

autoplayGifs(checkbox, true):
"Autoplay gifs"

Display:
theme(select, "Nitter"):
"Theme"

infiniteScroll(checkbox, false):
"Infinite scrolling (requires JavaScript, experimental!)"
"Infinite scrolling (experimental, requires JavaScript)"

stickyProfile(checkbox, true):
"Make profile sidebar stick to top"

@@ -103,6 +75,39 @@ genPrefs:
hideReplies(checkbox, false):
"Hide tweet replies"

Media:
mp4Playback(checkbox, true):
"Enable mp4 video playback (only for gifs)"

hlsPlayback(checkbox, false):
"Enable hls video streaming (requires JavaScript)"

proxyVideos(checkbox, true):
"Proxy video streaming through the server (might be slow)"

muteVideos(checkbox, false):
"Mute videos by default"

autoplayGifs(checkbox, true):
"Autoplay gifs"

"Link replacements (blank to disable)":
replaceTwitter(input, ""):
"Twitter -> Nitter"
placeholder: "Nitter hostname"

replaceYouTube(input, ""):
"YouTube -> Piped/Invidious"
placeholder: "Piped hostname"

replaceReddit(input, ""):
"Reddit -> Teddit/Libreddit"
placeholder: "Teddit hostname"

replaceInstagram(input, ""):
"Instagram -> Bibliogram"
placeholder: "Bibliogram hostname"

iterator allPrefs*(): Pref =
for k, v in prefList:
for pref in v:
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strutils, strformat, sequtils, tables, uri

import types

@@ -11,6 +12,8 @@ const
"verified", "safe"
]

emptyQuery* = "include:nativeretweets"

template `@`(param: string): untyped =
if param in pms: pms[param]
else: ""
@@ -1,16 +1,27 @@
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, times, strutils, tables, hashes
import redis, redpool, frosty, supersnappy
import redis, redpool, flatty, supersnappy

import types, api

const redisNil = "\0\0"
const
redisNil = "\0\0"
baseCacheTime = 60 * 60

var
pool {.threadvar.}: RedisPool
baseCacheTime = 60 * 60
pool: RedisPool
rssCacheTime: int
listCacheTime*: int

# flatty can't serialize DateTime, so we need to define this
proc toFlatty*(s: var string, x: DateTime) =
s.toFlatty(x.toTime().toUnix())

proc fromFlatty*(s: string, i: var int, x: var DateTime) =
var unix: int64
s.fromFlatty(i, unix)
x = fromUnix(unix).utc()

proc setCacheTimes*(cfg: Config) =
rssCacheTime = cfg.rssCacheTime * 60
listCacheTime = cfg.listCacheTime * 60

@@ -22,21 +33,23 @@ proc migrate*(key, match: string) {.async.} =
let list = await r.scan(newCursor(0), match, 100000)
r.startPipelining()
for item in list:
if item != "p:" or item == match:
discard await r.del(item)
discard await r.del(item)
await r.setk(key, "true")
discard await r.flushPipeline()

proc initRedisPool*(cfg: Config) {.async.} =
try:
pool = await newRedisPool(cfg.redisConns, maxConns=cfg.redisMaxConns,
host=cfg.redisHost, port=cfg.redisPort)
pool = await newRedisPool(cfg.redisConns, cfg.redisMaxConns,
host=cfg.redisHost, port=cfg.redisPort,
password=cfg.redisPassword)

await migrate("flatty", "*:*")
await migrate("snappyRss", "rss:*")
await migrate("oldFrosty", "*")
await migrate("userBuckets", "p:")
await migrate("userBuckets", "p:*")
await migrate("profileDates", "p:*")

pool.withAcquire(r):
# optimize memory usage for profile ID buckets
await r.configSet("hash-max-ziplist-entries", "1000")

except OSError:

@@ -46,87 +59,97 @@ proc initRedisPool*(cfg: Config) {.async.} =

template pidKey(name: string): string = "pid:" & $(hash(name) div 1_000_000)
template profileKey(name: string): string = "p:" & name
template listKey(l: List): string = toLower("l:" & l.username & '/' & l.name)
template listKey(l: List): string = "l:" & l.id

proc get(query: string): Future[string] {.async.} =
pool.withAcquire(r):
result = await r.get(query)

proc setex(key: string; time: int; data: string) {.async.} =
proc setEx(key: string; time: int; data: string) {.async.} =
pool.withAcquire(r):
discard await r.setex(key, time, data)
discard await r.setEx(key, time, data)

proc cache*(data: List) {.async.} =
await setex(data.listKey, listCacheTime, compress(freeze(data)))
await setEx(data.listKey, listCacheTime, compress(toFlatty(data)))

proc cache*(data: PhotoRail; name: string) {.async.} =
await setex("pr:" & name, baseCacheTime, compress(freeze(data)))
await setEx("pr:" & toLower(name), baseCacheTime, compress(toFlatty(data)))

proc cache*(data: Profile) {.async.} =
if data.username.len == 0 or data.id.len == 0: return
let name = toLower(data.username)
pool.withAcquire(r):
r.startPipelining()
discard await r.setex(name.profileKey, baseCacheTime, compress(freeze(data)))
discard await r.hset(name.pidKey, name, data.id)
discard await r.setEx(name.profileKey, baseCacheTime, compress(toFlatty(data)))
discard await r.setEx("i:" & data.id, baseCacheTime, data.username)
discard await r.hSet(name.pidKey, name, data.id)
discard await r.flushPipeline()

proc cacheProfileId*(username, id: string) {.async.} =
if username.len == 0 or id.len == 0: return
let name = toLower(username)
pool.withAcquire(r):
discard await r.hset(name.pidKey, name, id)
discard await r.hSet(name.pidKey, name, id)

proc cacheRss*(query: string; rss: Rss) {.async.} =
let key = "rss:" & query
pool.withAcquire(r):
r.startPipelining()
discard await r.hset(key, "rss", rss.feed)
discard await r.hset(key, "min", rss.cursor)
discard await r.hSet(key, "rss", rss.feed)
discard await r.hSet(key, "min", rss.cursor)
discard await r.expire(key, rssCacheTime)
discard await r.flushPipeline()

proc getProfileId*(username: string): Future[string] {.async.} =
let name = toLower(username)
pool.withAcquire(r):
result = await r.hget(name.pidKey, name)
result = await r.hGet(name.pidKey, name)
if result == redisNil:
result.setLen(0)

proc getCachedProfile*(username: string; fetch=true): Future[Profile] {.async.} =
let prof = await get("p:" & toLower(username))
if prof != redisNil:
uncompress(prof).thaw(result)
result = fromFlatty(uncompress(prof), Profile)
elif fetch:
result = await getProfile(username)

proc getCachedProfileUsername*(userId: string): Future[string] {.async.} =
let username = await get("i:" & userId)
if username != redisNil:
result = username
else:
let profile = await getProfileById(userId)
result = profile.username
await cache(profile)

proc getCachedPhotoRail*(name: string): Future[PhotoRail] {.async.} =
if name.len == 0: return
let rail = await get("pr:" & toLower(name))
if rail != redisNil:
uncompress(rail).thaw(result)
result = fromFlatty(uncompress(rail), PhotoRail)
else:
result = await getPhotoRail(name)
await cache(result, name)

proc getCachedList*(username=""; name=""; id=""): Future[List] {.async.} =
let list = if id.len > 0: redisNil
else: await get(toLower("l:" & username & '/' & name))
proc getCachedList*(username=""; slug=""; id=""): Future[List] {.async.} =
let list = if id.len == 0: redisNil
else: await get("l:" & id)

if list != redisNil:
uncompress(list).thaw(result)
result = fromFlatty(uncompress(list), List)
else:
if id.len > 0:
result = await getGraphListById(id)
result = await getGraphList(id)
else:
result = await getGraphList(username, name)
result = await getGraphListBySlug(username, slug)
await cache(result)

proc getCachedRss*(key: string): Future[Rss] {.async.} =
let k = "rss:" & key
pool.withAcquire(r):
result.cursor = await r.hget(k, "min")
result.cursor = await r.hGet(k, "min")
if result.cursor.len > 2:
result.feed = await r.hget(k, "rss")
result.feed = await r.hGet(k, "rss")
else:
result.cursor.setLen 0
@@ -0,0 +1,10 @@
# SPDX-License-Identifier: AGPL-3.0-only
import jester
import router_utils
import ".."/[tokens, types]

proc createDebugRouter*(cfg: Config) =
router debug:
get "/.tokens":
cond cfg.enableDebug
respJson getPoolJson()
@@ -1,8 +1,9 @@
import asyncdispatch, strutils, sequtils, uri, options
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, strutils, options
import jester
import ".."/[types, api], ../views/embed

export embed
export api, embed

proc createEmbedRouter*(cfg: Config) =
router embed:
@@ -1,47 +1,51 @@
import strutils
# SPDX-License-Identifier: AGPL-3.0-only
import strutils, uri

import jester

import router_utils
import ".."/[query, types, redis_cache, api]
import ".."/[types, redis_cache, api]
import ../views/[general, timeline, list]
export getListTimeline, getGraphList

template respList*(list, timeline, vnode: typed) =
if list.id.len == 0:
resp Http404, showError("List \"" & @"list" & "\" not found", cfg)
template respList*(list, timeline, title, vnode: typed) =
if list.id.len == 0 or list.name.len == 0:
resp Http404, showError("List " & @"id" & " not found", cfg)

let
html = renderList(vnode, timeline.query, list)
rss = "/$1/lists/$2/rss" % [@"name", @"list"]
rss = "/i/lists/$1/rss" % [@"id"]

resp renderMain(html, request, cfg, prefs, rss=rss, banner=list.banner)
resp renderMain(html, request, cfg, prefs, titleText=title, rss=rss, banner=list.banner)

proc createListRouter*(cfg: Config) =
router list:
get "/@name/lists/@list/?":
get "/@name/lists/@slug/?":
cond '.' notin @"name"
cond @"name" != "i"
cond @"slug" != "memberships"
let
prefs = cookiePrefs()
list = await getCachedList(@"name", @"list")
timeline = await getListTimeline(list.id, getCursor())
vnode = renderTimelineTweets(timeline, prefs, request.path)
respList(list, timeline, vnode)

get "/@name/lists/@list/members":
cond '.' notin @"name"
cond @"name" != "i"
let
prefs = cookiePrefs()
list = await getCachedList(@"name", @"list")
members = await getListMembers(list, getCursor())
respList(list, members, renderTimelineUsers(members, prefs, request.path))
slug = decodeUrl(@"slug")
list = await getCachedList(@"name", slug)
if list.id.len == 0:
resp Http404, showError("List \"" & @"slug" & "\" not found", cfg)
redirect("/i/lists/" & list.id)

get "/i/lists/@id/?":
cond '.' notin @"id"
let list = await getCachedList(id=(@"id"))
if list.id.len == 0:
resp Http404
await cache(list)
redirect("/" & list.username & "/lists/" & list.name)
let
prefs = cookiePrefs()
list = await getCachedList(id=(@"id"))
title = "@" & list.username & "/" & list.name
timeline = await getListTimeline(list.id, getCursor())
vnode = renderTimelineTweets(timeline, prefs, request.path)
respList(list, timeline, title, vnode)

get "/i/lists/@id/members":
cond '.' notin @"id"
let
prefs = cookiePrefs()
list = await getCachedList(id=(@"id"))
title = "@" & list.username & "/" & list.name
members = await getListMembers(list, getCursor())
respList(list, members, title, renderTimelineUsers(members, prefs, request.path))
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import uri, strutils, httpclient, os, hashes, base64, re
import asynchttpserver, asyncstreams, asyncfile, asyncnet
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strutils, uri, os, algorithm

import jester

@@ -31,7 +32,7 @@ proc createPrefRouter*(cfg: Config) =

post "/resetprefs":
genResetPrefs()
redirect($(parseUri("/settings") ? filterParams(request.params)))
redirect("/settings?referer=" & encodeUrl(refPath()))

post "/enablehls":
savePref("hlsPlayback", "on", request)
@@ -1,9 +1,10 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strutils

import jester

import router_utils
import ".."/[query, types, api]
import ".."/[types, api]
import ../views/general

template respResolved*(url, kind: string): untyped =
@@ -1,4 +1,5 @@
import strutils, sequtils, uri, tables
# SPDX-License-Identifier: AGPL-3.0-only
import strutils, sequtils, uri, tables, json
from jester import Request, cookies

import ../views/general

@@ -41,3 +42,6 @@ template getCursor*(req: Request): string =

proc getNames*(name: string): seq[string] =
name.strip(chars={'/'}).split(",").filterIt(it.len > 0)

template respJson*(node: JsonNode) =
resp $node, "application/json"
@@ -1,6 +1,7 @@
import asyncdispatch, strutils, tables, times, sequtils, hashes, supersnappy
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, strutils, tables, times, hashes, uri

import jester
import jester, supersnappy

import router_utils, timeline
import ../query

@@ -35,12 +36,18 @@ proc timelineRss*(req: Request; cfg: Config; query: Query): Future[Rss] {.async.
return Rss(feed: profile.username, cursor: "suspended")

if profile.fullname.len > 0:
let rss = compress renderTimelineRss(timeline, profile, cfg, multi=(names.len > 1))
let rss = compress renderTimelineRss(timeline, profile, cfg,
multi=(names.len > 1))
return Rss(feed: rss, cursor: timeline.bottom)

template respRss*(rss) =
template respRss*(rss, page) =
if rss.cursor.len == 0:
resp Http404, showError("User \"" & @"name" & "\" not found", cfg)
let info = case page
of "User": " \"$1\" " % @"name"
of "List": " $1 " % @"id"
else: " "

resp Http404, showError(page & info & "not found", cfg)
elif rss.cursor.len == 9 and rss.cursor == "suspended":
resp Http404, showError(getSuspended(rss.feed), cfg)

@@ -51,6 +58,7 @@ template respRss*(rss) =
proc createRssRouter*(cfg: Config) =
router rss:
get "/search/rss":
cond cfg.enableRss
if @"q".len > 200:
resp Http400, showError("Search input too long.", cfg)

@@ -60,11 +68,11 @@ proc createRssRouter*(cfg: Config) =

let
cursor = getCursor()
key = $hash(genQueryUrl(query)) & cursor
key = "search:" & $hash(genQueryUrl(query)) & ":" & cursor

var rss = await getCachedRss(key)
if rss.cursor.len > 0:
respRss(rss)
respRss(rss, "Search")

let tweets = await getSearch[Tweet](query, cursor)
rss.cursor = tweets.bottom

@@ -72,25 +80,27 @@ proc createRssRouter*(cfg: Config) =
genQueryUrl(query), cfg)

await cacheRss(key, rss)
respRss(rss)
respRss(rss, "Search")

get "/@name/rss":
cond cfg.enableRss
cond '.' notin @"name"
let
cursor = getCursor()
name = @"name"
key = name & cursor
key = "twitter:" & name & ":" & cursor

var rss = await getCachedRss(key)
if rss.cursor.len > 0:
respRss(rss)
respRss(rss, "User")

rss = await timelineRss(request, cfg, Query(fromUser: @[name]))

await cacheRss(key, rss)
respRss(rss)
respRss(rss, "User")

get "/@name/@tab/rss":
cond cfg.enableRss
cond '.' notin @"name"
cond @"tab" in ["with_replies", "media", "search"]
let name = @"name"

@@ -101,38 +111,57 @@ proc createRssRouter*(cfg: Config) =
of "search": initQuery(params(request), name=name)
else: Query(fromUser: @[name])

var key = @"name" & "/" & @"tab"
var key = @"tab" & ":" & @"name" & ":"
if @"tab" == "search":
key &= $hash(genQueryUrl(query))
key &= $hash(genQueryUrl(query)) & ":"
key &= getCursor()

var rss = await getCachedRss(key)
if rss.cursor.len > 0:
respRss(rss)
respRss(rss, "User")

rss = await timelineRss(request, cfg, query)

await cacheRss(key, rss)
respRss(rss)
respRss(rss, "User")

get "/@name/lists/@list/rss":
cond '.' notin @"name"
get "/@name/lists/@slug/rss":
cond cfg.enableRss
cond @"name" != "i"
let
slug = decodeUrl(@"slug")
list = await getCachedList(@"name", slug)
cursor = getCursor()

if list.id.len == 0:
resp Http404, showError("List \"" & @"slug" & "\" not found", cfg)

let url = "/i/lists/" & list.id & "/rss"
if cursor.len > 0:
redirect(url & "?cursor=" & encodeUrl(cursor, false))
else:
redirect(url)

get "/i/lists/@id/rss":
cond cfg.enableRss
let
cursor = getCursor()
key = @"name" & "/" & @"list" & cursor
key =
if cursor.len == 0: "lists:" & @"id"
else: "lists:" & @"id" & ":" & cursor

var rss = await getCachedRss(key)
if rss.cursor.len > 0:
respRss(rss)
respRss(rss, "List")

let
list = await getCachedList(@"name", @"list")
list = await getCachedList(id=(@"id"))
timeline = await getListTimeline(list.id, cursor)
rss.cursor = timeline.bottom
rss.feed = compress renderListRss(timeline.content, list, cfg)

await cacheRss(key, rss)
respRss(rss)
respRss(rss, "List")

get "/@name/status/@id/rss":
cond '.' notin @"name"

@@ -143,7 +172,7 @@ proc createRssRouter*(cfg: Config) =
|
||||
var rss = await getCachedRss(key)
|
||||
if rss.cursor.len > 0:
|
||||
respRss(rss)
|
||||
respRss(rss, "Tweet")
|
||||
|
||||
var conv = await getTweet(id)
|
||||
if conv == nil or conv.tweet == nil or conv.tweet.id == 0:
|
||||
|
@ -163,4 +192,4 @@ proc createRssRouter*(cfg: Config) =
|
|||
rss = Rss(feed: feed, cursor: $lastThreadTweets[0].id)
|
||||
|
||||
await cacheRss(key, rss)
|
||||
respRss(rss)
|
||||
respRss(rss, "Tweet")
|
||||
|
|
|
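The hunks above move the RSS cache toward namespaced keys ("search:", "twitter:", "lists:"), so different feed types can no longer collide under the same cache key. A minimal Python sketch of that keying scheme (the function name and hashing choice are illustrative, not Nitter's actual code):

```python
from hashlib import sha1

def rss_cache_key(kind: str, ident: str, cursor: str = "") -> str:
    """Build a namespaced cache key, mirroring the 'kind:ident:cursor' scheme."""
    # Hash long identifiers (e.g. full search query strings) to keep keys short.
    if kind == "search":
        ident = sha1(ident.encode()).hexdigest()[:12]
    key = f"{kind}:{ident}"
    if cursor:
        key += f":{cursor}"
    return key
```

With this shape, a user timeline key like `twitter:jack:CURSOR` can never clash with a list feed `lists:123:CURSOR`, which is the point of the change.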
@@ -1,9 +1,10 @@
import strutils, sequtils, uri
# SPDX-License-Identifier: AGPL-3.0-only
import strutils, uri

import jester

import router_utils
import ".."/[query, types, api]
import ".."/[query, types, api, formatters]
import ../views/[general, search]

include "../views/opensearch.nimf"

@@ -39,7 +40,6 @@ proc createSearchRouter*(cfg: Config) =
redirect("/search?q=" & encodeUrl("#" & @"hash"))

get "/opensearch":
var url = if cfg.useHttps: "https://" else: "http://"
url &= cfg.hostname & "/search?q="
let url = getUrlPrefix(cfg) & "/search?q="
resp Http200, {"Content-Type": "application/opensearchdescription+xml"},
generateOpenSearchXML(cfg.title, cfg.hostname, url)
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, strutils, sequtils, uri, options

import jester, karax/vdom
@@ -17,6 +18,7 @@ proc createStatusRouter*(cfg: Config) =
cond '.' notin @"name"
let prefs = cookiePrefs()

# used for the infinite scroll feature
if @"scroll".len > 0:
let replies = await getReplies(@"id", getCursor())
if replies.content.len == 0:
@@ -33,10 +35,12 @@ proc createStatusRouter*(cfg: Config) =
error = conv.tweet.tombstone
resp Http404, showError(error, cfg)

var
let
title = pageTitle(conv.tweet)
ogTitle = pageTitle(conv.tweet.profile)
desc = conv.tweet.text

var
images = conv.tweet.photos
video = ""
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, strutils, sequtils, uri, options, times
import jester, karax/vdom

@@ -45,7 +46,7 @@ proc fetchSingleTimeline*(after: string; query: Query; skipRail=false):
return

var rail: Future[PhotoRail]
if skipRail or query.kind == media:
if skipRail or profile.protected or query.kind == media:
rail = newFuture[PhotoRail]()
rail.complete(@[])
else:
@@ -77,9 +78,6 @@ proc fetchSingleTimeline*(after: string; query: Query; skipRail=false):

return (profile, timeline, await rail)

proc get*(req: Request; key: string): string =
params(req).getOrDefault(key)

proc showTimeline*(request: Request; query: Query; cfg: Config; prefs: Prefs;
rss, after: string): Future[string] {.async.} =
if query.fromUser.len != 1:
@@ -95,7 +93,7 @@ proc showTimeline*(request: Request; query: Query; cfg: Config; prefs: Prefs;

let pHtml = renderProfile(p, t, r, prefs, getPath())
result = renderMain(pHtml, request, cfg, prefs, pageTitle(p), pageDesc(p),
rss=rss, images = @[p.getUserpic("_400x400")],
rss=rss, images = @[p.getUserPic("_400x400")],
banner=p.banner)

template respTimeline*(timeline: typed) =
@@ -104,8 +102,22 @@ template respTimeline*(timeline: typed) =
resp Http404, showError("User \"" & @"name" & "\" not found", cfg)
resp t

template respUserId*() =
cond @"user_id".len > 0
let username = await getCachedProfileUsername(@"user_id")
if username.len > 0:
redirect("/" & username)
else:
resp Http404, showError("User not found", cfg)

proc createTimelineRouter*(cfg: Config) =
router timeline:
get "/i/user/@user_id":
respUserId()

get "/intent/user":
respUserId()

get "/@name/?@tab?/?":
cond '.' notin @"name"
cond @"name" notin ["pic", "gif", "video"]
@@ -119,6 +131,7 @@ proc createTimelineRouter*(cfg: Config) =
if names.len != 1:
query.fromUser = names

# used for the infinite scroll feature
if @"scroll".len > 0:
if query.fromUser.len != 1:
var timeline = await getSearch[Tweet](query, after)
@@ -131,10 +144,12 @@ proc createTimelineRouter*(cfg: Config) =
timeline.beginning = true
resp $renderTimelineTweets(timeline, prefs, getPath())

var rss = "/$1/$2/rss" % [@"name", @"tab"]
if @"tab".len == 0:
rss = "/$1/rss" % @"name"
elif @"tab" == "search":
rss &= "?" & genQueryUrl(query)
let rss =
if @"tab".len == 0:
"/$1/rss" % @"name"
elif @"tab" == "search":
"/$1/search/rss?$2" % [@"name", genQueryUrl(query)]
else:
"/$1/$2/rss" % [@"name", @"tab"]

respTimeline(await showTimeline(request, query, cfg, prefs, rss, after))
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import jester

import router_utils
@@ -10,10 +11,13 @@ proc createUnsupportedRouter*(cfg: Config) =
resp renderMain(renderFeature(), request, cfg, themePrefs())

get "/about/feature": feature()
get "/intent/?@i?": feature()
get "/login/?@i?": feature()
get "/@name/lists/?": feature()

get "/i/@i?/?@j?":
cond @"i" notin ["status", "lists"]
get "/intent/?@i?":
cond @"i" notin ["user"]
feature()

get "/i/@i?/?@j?":
cond @"i" notin ["status", "lists", "user"]
feature()
@@ -91,11 +91,12 @@ legend {
padding: .6em 0 .3em 0;
border: 0;
font-size: 16px;
font-weight: 600;
border-bottom: 1px solid var(--border_grey);
margin-bottom: 8px;
}

.cookie-note {
.preferences .note {
border-top: 1px solid var(--border_grey);
border-bottom: 1px solid var(--border_grey);
padding: 6px 0 8px 0;

@@ -51,9 +51,10 @@
margin-bottom: unset;
}

@media(max-width: 600px) {
@media(max-width: 700px) {
.profile-tabs {
width: 100vw;
max-width: 600px;

.timeline-container {
width: 100% !important;
@@ -71,3 +72,9 @@
padding: 0;
}
}

@media (min-height: 900px) {
.profile-tab.sticky {
position: sticky;
}
}

@@ -103,7 +103,7 @@
color: var(--profile_stat);
}

@media(max-width: 600px) {
@media(max-width: 700px) {
.profile-card-info {
display: flex;
}

@@ -59,7 +59,7 @@
padding-bottom: 12px;
}

@media(max-width: 600px) {
@media(max-width: 700px) {
.photo-rail-header {
display: none;
}
@@ -74,9 +74,7 @@
overflow: hidden;
transition: max-height 0.4s;
}
}

@media(max-width: 600px) {
.photo-rail-grid {
grid-template-columns: repeat(6, 1fr);
}

@@ -75,6 +75,7 @@

&.wide {
flex-grow: 1.2;
flex-basis: 50px;
}
}

@@ -174,7 +174,8 @@

.tweet-stat {
padding-top: 5px;
padding-right: 8px;
min-width: 1em;
margin-right: 0.8em;
}

.show-thread {
@@ -188,8 +189,10 @@
height: 100%;
padding: 12px;
border: solid 1px var(--dark_grey);
box-sizing: border-box;
border-radius: 10px;
background-color: var(--bg_color);
z-index: 2;
}

.tweet-link {

@@ -37,7 +37,7 @@
@include ellipsis;
white-space: unset;
font-weight: bold;
font-size: 1.15em;
font-size: 1.1em;
}

.card-description {

@@ -2,8 +2,8 @@
@import '_mixins';

video {
height: 100%;
width: 100%;
max-height: 100%;
max-width: 100%;
}

.gallery-video {
@@ -18,10 +18,13 @@ video {
.video-container {
max-height: 530px;
margin: 0;
display: flex;
align-items: center;
justify-content: center;

img {
height: 100%;
width: 100%;
max-height: 100%;
max-width: 100%;
}
}
127 src/tokens.nim
@@ -1,26 +1,60 @@
import asyncdispatch, httpclient, times, sequtils, json, math, random
import strutils, strformat
# SPDX-License-Identifier: AGPL-3.0-only
import asyncdispatch, httpclient, times, sequtils, json, random
import strutils, tables
import zippy
import types, agents, consts, http_pool

const
expirationTime = 3.hours
maxLastUse = 1.hours
resetPeriod = 15.minutes
maxConcurrentReqs = 5 # max requests at a time per token, to avoid race conditions
maxAge = 3.hours # tokens expire after 3 hours
maxLastUse = 1.hours # if a token is unused for 60 minutes, it expires
failDelay = initDuration(minutes=30)

var
clientPool {.threadvar.}: HttpPool
tokenPool {.threadvar.}: seq[Token]
clientPool: HttpPool
tokenPool: seq[Token]
lastFailed: Time

proc getPoolInfo*: string =
if tokenPool.len == 0: return "token pool empty"
proc getPoolJson*(): JsonNode =
var
list = newJObject()
totalReqs = 0
totalPending = 0
reqsPerApi: Table[string, int]

let avg = tokenPool.mapIt(it.remaining).sum() div tokenPool.len
return &"{tokenPool.len} tokens, average remaining: {avg}"
for token in tokenPool:
totalPending.inc(token.pending)
list[token.tok] = %*{
"apis": newJObject(),
"pending": token.pending,
"init": $token.init,
"lastUse": $token.lastUse
}

for api in token.apis.keys:
list[token.tok]["apis"][$api] = %token.apis[api]

let
maxReqs =
case api
of Api.listBySlug, Api.list: 500
of Api.timeline: 187
else: 180
reqs = maxReqs - token.apis[api].remaining

reqsPerApi[$api] = reqsPerApi.getOrDefault($api, 0) + reqs
totalReqs.inc(reqs)

return %*{
"amount": tokenPool.len,
"requests": totalReqs,
"pending": totalPending,
"apis": reqsPerApi,
"tokens": list
}

proc rateLimitError*(): ref RateLimitError =
newException(RateLimitError, "rate limited with " & getPoolInfo())
newException(RateLimitError, "rate limited")

proc fetchToken(): Future[Token] {.async.} =
if getTime() - lastFailed < failDelay:
@@ -28,58 +62,75 @@ proc fetchToken(): Future[Token] {.async.} =

let headers = newHttpHeaders({
"accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
"accept-encoding": "gzip",
"accept-language": "en-US,en;q=0.5",
"connection": "keep-alive",
"user-agent": getAgent(),
"authorization": auth
})

var
resp: string
tokNode: JsonNode
tok: string

try:
resp = clientPool.use(headers): await c.postContent(activate)
tokNode = parseJson(resp)["guest_token"]
tok = tokNode.getStr($(tokNode.getInt))
let
resp = clientPool.use(headers): await c.postContent(activate)
tokNode = parseJson(uncompress(resp))["guest_token"]
tok = tokNode.getStr($(tokNode.getInt))
time = getTime()

let time = getTime()
result = Token(tok: tok, remaining: 187, reset: time + resetPeriod,
init: time, lastUse: time)
return Token(tok: tok, init: time, lastUse: time)
except Exception as e:
lastFailed = getTime()
echo "fetching token failed: ", e.msg

template expired(token: Token): untyped =
proc expired(token: Token): bool =
let time = getTime()
token.init < time - expirationTime or
token.lastUse < time - maxLastUse
token.init < time - maxAge or token.lastUse < time - maxLastUse

template isLimited(token: Token): untyped =
token == nil or (token.remaining <= 1 and token.reset > getTime()) or
token.expired
proc isLimited(token: Token; api: Api): bool =
if token.isNil or token.expired:
return true

proc release*(token: Token; invalid=false) =
if token != nil and (invalid or token.expired):
if api in token.apis:
let limit = token.apis[api]
return (limit.remaining <= 10 and limit.reset > epochTime().int)
else:
return false

proc isReady(token: Token; api: Api): bool =
not (token.isNil or token.pending > maxConcurrentReqs or token.isLimited(api))

proc release*(token: Token; used=false; invalid=false) =
if token.isNil: return
if invalid or token.expired:
let idx = tokenPool.find(token)
if idx > -1: tokenPool.delete(idx)
elif used:
dec token.pending
token.lastUse = getTime()

proc getToken*(): Future[Token] {.async.} =
proc getToken*(api: Api): Future[Token] {.async.} =
for i in 0 ..< tokenPool.len:
if not result.isLimited: break
if result.isReady(api): break
release(result)
result = tokenPool.sample()

if result.isLimited:
if not result.isReady(api):
release(result)
result = await fetchToken()
tokenPool.add result

if result == nil:
if not result.isNil:
inc result.pending
else:
raise rateLimitError()

dec result.remaining
proc setRateLimit*(token: Token; api: Api; remaining, reset: int) =
# avoid undefined behavior in race conditions
if api in token.apis:
let limit = token.apis[api]
if limit.reset >= reset and limit.remaining < remaining:
return

token.apis[api] = RateLimit(remaining: remaining, reset: reset)

proc poolTokens*(amount: int) {.async.} =
var futs: seq[Future[Token]]
@@ -92,13 +143,13 @@ proc poolTokens*(amount: int) {.async.} =
try: newToken = await token
except: discard

if newToken != nil:
if not newToken.isNil:
tokenPool.add newToken

proc initTokenPool*(cfg: Config) {.async.} =
clientPool = HttpPool()

while true:
if tokenPool.countIt(not it.isLimited) < cfg.minTokens:
if tokenPool.countIt(not it.isLimited(Api.timeline)) < cfg.minTokens:
await poolTokens(min(4, cfg.minTokens - tokenPool.len))
await sleepAsync(2000)
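The rewritten token pool above tracks a separate rate limit per API endpoint and guards against stale limit updates racing in from concurrent requests. A rough Python sketch of that bookkeeping, with simplified names (`Token`, `set_rate_limit`) that mirror but are not Nitter's Nim API:

```python
import time

MAX_CONCURRENT = 5  # mirrors maxConcurrentReqs in the diff above

class Token:
    def __init__(self, tok: str):
        self.tok = tok
        self.pending = 0
        self.apis = {}  # api name -> (remaining, reset as epoch seconds)

    def set_rate_limit(self, api: str, remaining: int, reset: int):
        # Ignore a stale update: same-or-older reset window but a higher
        # remaining count means an out-of-order response arrived late.
        if api in self.apis:
            old_remaining, old_reset = self.apis[api]
            if old_reset >= reset and old_remaining < remaining:
                return
        self.apis[api] = (remaining, reset)

    def is_limited(self, api: str) -> bool:
        # A token with no recorded limit for this API is assumed usable.
        if api not in self.apis:
            return False
        remaining, reset = self.apis[api]
        return remaining <= 10 and reset > int(time.time())

    def is_ready(self, api: str) -> bool:
        return self.pending <= MAX_CONCURRENT and not self.is_limited(api)
```

The point of keying limits per API is that exhausting e.g. the timeline endpoint no longer blocks the same token from serving list or search requests.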
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import times, sequtils, options, tables
import prefs_impl

@@ -5,13 +6,28 @@ genPrefsType()

type
RateLimitError* = object of CatchableError
InternalError* = object of CatchableError

Api* {.pure.} = enum
userShow
photoRail
timeline
search
tweet
list
listBySlug
listMembers

RateLimit* = object
remaining*: int
reset*: int

Token* = ref object
tok*: string
remaining*: int
reset*: Time
init*: Time
lastUse*: Time
pending*: int
apis*: Table[Api, RateLimit]

Error* = enum
null = 0
@@ -36,7 +52,7 @@ type
location*: string
website*: string
bio*: string
userpic*: string
userPic*: string
banner*: string
following*: string
followers*: string
@@ -46,7 +62,7 @@ type
verified*: bool
protected*: bool
suspended*: bool
joinDate*: Time
joinDate*: DateTime

VideoType* = enum
m3u8 = "application/x-mpegURL"
@@ -126,6 +142,8 @@ type
videoDirectMessage = "video_direct_message"
imageDirectMessage = "image_direct_message"
audiospace = "audiospace"
newsletterPublication = "newsletter_publication"
unknown

Card* = object
kind*: CardKind
@@ -150,7 +168,7 @@ type
replyId*: int64
profile*: Profile
text*: string
time*: Time
time*: DateTime
reply*: seq[string]
pinned*: bool
hasThread*: bool
@@ -212,6 +230,10 @@ type
hmacKey*: string
base64Media*: bool
minTokens*: int
enableRss*: bool
enableDebug*: bool
proxy*: string
proxyAuth*: string

rssCacheTime*: int
listCacheTime*: int
@@ -220,8 +242,7 @@ type
redisPort*: int
redisConns*: int
redisMaxConns*: int

replaceYouTube*: string
redisPassword*: string

Rss* = object
feed*, cursor*: string
@@ -1,15 +1,15 @@
import strutils, strformat, sequtils, uri, tables, base64
# SPDX-License-Identifier: AGPL-3.0-only
import strutils, strformat, uri, tables, base64
import nimcrypto, regex

var
hmacKey {.threadvar.}: string
hmacKey: string
base64Media = false

const
https* = "https://"
twimg* = "pbs.twimg.com/"
badJpgExts = @["1500x500", "jpgn", "jpg:", "jpg_", "_jpg"]
badPngExts = @["pngn", "png:", "png_", "_png"]
nitterParams = ["name", "tab", "id", "list", "referer", "scroll"]
twitterDomains = @[
"twitter.com",
"pic.twitter.com",
@@ -42,17 +42,10 @@ proc getPicUrl*(link: string): string =
else:
&"/pic/{encodeUrl(link)}"

proc cleanFilename*(filename: string): string =
const reg = re"[^A-Za-z0-9._-]"
result = filename.replace(reg, "_")
if badJpgExts.anyIt(it in result):
result &= ".jpg"
elif badPngExts.anyIt(it in result):
result &= ".png"

proc filterParams*(params: Table): seq[(string, string)] =
const filter = ["name", "id", "list", "referer", "scroll"]
toSeq(params.pairs()).filterIt(it[0] notin filter and it[1].len > 0)
for p in params.pairs():
if p[1].len > 0 and p[0] notin nitterParams:
result.add p

proc isTwitterUrl*(uri: Uri): bool =
uri.hostname in twitterDomains
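The `filterParams` change above consolidates the internal routing parameters into one `nitterParams` list and drops empty values before rebuilding a query string. A small Python equivalent of the filtering logic (names are illustrative):

```python
# Parameters Nitter uses internally for routing; they should not leak
# back into rebuilt query strings.
NITTER_PARAMS = ("name", "tab", "id", "list", "referer", "scroll")

def filter_params(params: dict) -> list:
    """Keep only externally meaningful, non-empty query parameters."""
    return [(k, v) for k, v in params.items() if v and k not in NITTER_PARAMS]
```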
@@ -1,9 +1,12 @@
import karax/[karaxdsl, vdom]
import markdown
# SPDX-License-Identifier: AGPL-3.0-only
import strformat
import karax/[karaxdsl, vdom], markdown

const
hash = staticExec("git log -1 --format=\"%h\"")
date = staticExec("git show -s --format=\"%cd\" --date=format:\"%Y.%m.%d\"")
hash = staticExec("git show -s --format=\"%h\"")
link = "https://github.com/zedeus/nitter/commit/" & hash
version = &"{date}-{hash}"

let
about = markdown(readFile("public/md/about.md"))
@@ -14,8 +17,8 @@ proc renderAbout*(): VNode =
verbatim about
h2: text "Instance info"
p:
text "Commit "
a(href=link): text hash
text "Version "
a(href=link): text version

proc renderFeature*(): VNode =
buildHtml(tdiv(class="overlay-panel")):
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import options
import karax/[karaxdsl, vdom]
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import uri, strutils, strformat
import karax/[karaxdsl, vdom]

@@ -10,30 +11,31 @@ const
doctype = "<!DOCTYPE html>\n"
lp = readFile("public/lp.svg")

proc renderNavbar*(title, rss: string; req: Request): VNode =
let twitterPath = getTwitterLink(req.path, req.params)
var path = $(parseUri(req.path) ? filterParams(req.params))
if "/status" in path: path.add "#m"
proc renderNavbar(cfg: Config; req: Request; rss, canonical: string): VNode =
var path = req.params.getOrDefault("referer")
if path.len == 0:
path = $(parseUri(req.path) ? filterParams(req.params))
if "/status/" in path: path.add "#m"

buildHtml(nav):
tdiv(class="inner-nav"):
tdiv(class="nav-item"):
a(class="site-name", href="/"): text title
a(class="site-name", href="/"): text cfg.title

a(href="/"): img(class="site-logo", src="/logo.png")
a(href="/"): img(class="site-logo", src="/logo.png", alt="Logo")

tdiv(class="nav-item right"):
icon "search", title="Search", href="/search"
if rss.len > 0:
if cfg.enableRss and rss.len > 0:
icon "rss-feed", title="RSS Feed", href=rss
icon "bird", title="Open in Twitter", href=twitterPath
icon "bird", title="Open in Twitter", href=canonical
a(href="https://liberapay.com/zedeus"): verbatim lp
icon "info", title="About", href="/about"
iconReferer "cog", "/settings", path, title="Preferences"
icon "cog", title="Preferences", href=("/settings?referer=" & encodeUrl(path))

proc renderHead*(prefs: Prefs; cfg: Config; titleText=""; desc=""; video="";
images: seq[string] = @[]; banner=""; ogTitle=""; theme="";
rss=""): VNode =
rss=""; canonical=""): VNode =
let ogType =
if video.len > 0: "video"
elif images.len > 0: "photo"
@@ -42,7 +44,7 @@ proc renderHead*(prefs: Prefs; cfg: Config; titleText=""; desc=""; video="";
let opensearchUrl = getUrlPrefix(cfg) & "/opensearch"

buildHtml(head):
link(rel="stylesheet", type="text/css", href="/css/style.css?v=3")
link(rel="stylesheet", type="text/css", href="/css/style.css?v=9")
link(rel="stylesheet", type="text/css", href="/css/fontello.css?v=2")

if theme.len > 0:
@@ -56,7 +58,10 @@ proc renderHead*(prefs: Prefs; cfg: Config; titleText=""; desc=""; video="";
link(rel="search", type="application/opensearchdescription+xml", title=cfg.title,
href=opensearchUrl)

if rss.len > 0:
if canonical.len > 0:
link(rel="canonical", href=canonical)

if cfg.enableRss and rss.len > 0:
link(rel="alternate", type="application/rss+xml", href=rss, title="RSS feed")

if prefs.hlsPlayback:
@@ -73,6 +78,7 @@ proc renderHead*(prefs: Prefs; cfg: Config; titleText=""; desc=""; video="";
text cfg.title

meta(name="viewport", content="width=device-width, initial-scale=1.0")
meta(name="theme-color", content="#1F1F1F")
meta(property="og:type", content=ogType)
meta(property="og:title", content=(if ogTitle.len > 0: ogTitle else: titleText))
meta(property="og:description", content=stripHtml(desc))
@@ -81,10 +87,11 @@ proc renderHead*(prefs: Prefs; cfg: Config; titleText=""; desc=""; video="";

if banner.len > 0:
let bannerUrl = getPicUrl(banner)
link(rel="preload", type="image/png", href=getPicUrl(banner), `as`="image")
link(rel="preload", type="image/png", href=bannerUrl, `as`="image")

for url in images:
let suffix = if "400x400" in url: "" else: "?name=small"
let suffix = if "400x400" in url or url.endsWith("placeholder.png"): ""
else: "?name=small"
let preloadUrl = getPicUrl(url & suffix)
link(rel="preload", type="image/png", href=preloadUrl, `as`="image")

@@ -114,11 +121,14 @@ proc renderMain*(body: VNode; req: Request; cfg: Config; prefs=defaultPrefs;
if "theme" in req.params:
theme = toLowerAscii(req.params["theme"]).replace(" ", "_")

let canonical = getTwitterLink(req.path, req.params)

let node = buildHtml(html(lang="en")):
renderHead(prefs, cfg, titleText, desc, video, images, banner, ogTitle, theme, rss)
renderHead(prefs, cfg, titleText, desc, video, images, banner, ogTitle,
theme, rss, canonical)

body:
renderNavbar(cfg.title, rss, req)
renderNavbar(cfg, req, rss, canonical)

tdiv(class="container"):
body
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strformat
import karax/[karaxdsl, vdom]

@@ -24,5 +25,5 @@ proc renderList*(body: VNode; query: Query; list: List): VNode =
tdiv(class="timeline-description"):
text list.description

renderListTabs(query, &"/{list.username}/lists/{list.name}")
renderListTabs(query, &"/i/lists/{list.id}")
body
@@ -1,4 +1,5 @@
#? stdtmpl(subsChar = '$', metaChar = '#')
## SPDX-License-Identifier: AGPL-3.0-only
#proc generateOpenSearchXML*(name, hostname, url: string): string =
# result = ""
<?xml version="1.0" encoding="UTF-8"?>
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import tables, macros, strutils
import karax/[karaxdsl, vdom]

@@ -39,10 +40,10 @@ proc renderPreferences*(prefs: Prefs; path: string; themes: seq[string]): VNode

renderPrefs()

h4(class="cookie-note"):
h4(class="note"):
text "Preferences are stored client-side using cookies without any personal information."

button(`type`="submit", class="pref-submit"):
text "Save preferences"

buttonReferer "/resetprefs", "Reset Preferences", path, class="pref-reset"
buttonReferer "/resetprefs", "Reset preferences", path, class="pref-reset"
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strutils, strformat
import karax/[karaxdsl, vdom, vstyles]

@@ -14,12 +15,14 @@ proc renderStat(num, class: string; text=""): VNode =
proc renderProfileCard*(profile: Profile; prefs: Prefs): VNode =
buildHtml(tdiv(class="profile-card")):
tdiv(class="profile-card-info"):
let url = getPicUrl(profile.getUserPic())
var size = "_400x400"
if prefs.autoplayGifs and profile.userpic.endsWith("gif"):
size = ""
let
url = getPicUrl(profile.getUserPic())
size =
if prefs.autoplayGifs and profile.userPic.endsWith("gif"): ""
else: "_400x400"

a(class="profile-card-avatar", href=url, target="_blank"):
genImg(profile.getUserpic(size))
genImg(profile.getUserPic(size))

tdiv(class="profile-card-tabs-name"):
linkUser(profile, class="profile-card-fullname")
@@ -29,7 +32,7 @@ proc renderProfileCard*(profile: Profile; prefs: Prefs): VNode =
if profile.bio.len > 0:
tdiv(class="profile-bio"):
p(dir="auto"):
verbatim replaceUrl(profile.bio, prefs)
verbatim replaceUrls(profile.bio, prefs)

if profile.location.len > 0:
tdiv(class="profile-location"):
@@ -45,7 +48,7 @@ proc renderProfileCard*(profile: Profile; prefs: Prefs): VNode =
if profile.website.len > 0:
tdiv(class="profile-website"):
span:
let url = replaceUrl(profile.website, prefs)
let url = replaceUrls(profile.website, prefs)
icon "link"
a(href=url): text shortLink(url)

@@ -102,8 +105,8 @@ proc renderProfile*(profile: Profile; timeline: var Timeline;
tdiv(class="profile-banner"):
renderBanner(profile)

let sticky = if prefs.stickyProfile: "sticky" else: "unset"
tdiv(class="profile-tab", style={position: sticky}):
let sticky = if prefs.stickyProfile: " sticky" else: ""
tdiv(class=(&"profile-tab{sticky}")):
renderProfileCard(profile, prefs)
if photoRail.len > 0:
renderPhotoRail(profile, photoRail)
@@ -1,3 +1,4 @@
# SPDX-License-Identifier: AGPL-3.0-only
import strutils
import karax/[karaxdsl, vdom, vstyles]
import ".."/[types, utils]
@@ -30,7 +31,7 @@ proc linkUser*(profile: Profile, class=""): VNode =
icon "lock", title="Protected account"

proc linkText*(text: string; class=""): VNode =
let url = if "http" notin text: "http://" & text else: text
let url = if "http" notin text: https & text else: text
buildHtml():
a(href=url, class=class): text text

@@ -41,12 +42,6 @@ proc hiddenField*(name, value: string): VNode =
proc refererField*(path: string): VNode =
hiddenField("referer", path)

proc iconReferer*(icon, action, path: string, title=""): VNode =
buildHtml(form(`method`="get", action=action, class="icon-button")):
refererField path
button(`type`="submit"):
icon icon, title=title

proc buttonReferer*(action, text, path: string; class=""; `method`="post"): VNode =
buildHtml(form(`method`=`method`, action=action, class=class)):
refererField path
@@ -1,13 +1,18 @@
-#? stdtmpl(subsChar = '$', metaChad = '#')
-#import strutils, xmltree, strformat, options
-#import ../types, ../utils, ../formatters
+#? stdtmpl(subsChar = '$', metaChar = '#')
+## SPDX-License-Identifier: AGPL-3.0-only
+#import strutils, xmltree, strformat, options, unicode
+#import ../types, ../utils, ../formatters, ../prefs
 #
-#proc getTitle(tweet: Tweet; prefs: Prefs; retweet: string): string =
+#proc getTitle(tweet: Tweet; retweet: string): string =
 #if tweet.pinned: result = "Pinned: "
 #elif retweet.len > 0: result = &"RT by @{retweet}: "
 #elif tweet.reply.len > 0: result = &"R to @{tweet.reply[0]}: "
 #end if
-#result &= xmltree.escape(stripHtml(tweet.text))
+#var text = stripHtml(tweet.text)
+##if unicode.runeLen(text) > 32:
+## text = unicode.runeSubStr(text, 0, 32) & "..."
+##end if
+#result &= xmltree.escape(text)
 #if result.len > 0: return
 #end if
 #if tweet.photos.len > 0:
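The truncation sketched (commented out) in `getTitle` above counts runes rather than bytes, so multibyte characters are never split. A minimal Python analogue, assuming a hypothetical helper name — Python 3 strings index by code point, so plain slicing already matches what `unicode.runeSubStr` does in Nim:

```python
def truncate_title(text: str, limit: int = 32) -> str:
    # Slice by code points and append an ellipsis, mirroring
    # "unicode.runeSubStr(text, 0, 32) & '...'" from the template.
    return text if len(text) <= limit else text[:limit] + "..."
```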
@@ -23,15 +28,14 @@
 Twitter feed for: ${desc}. Generated by ${cfg.hostname}
 #end proc
 #
-#proc renderRssTweet(tweet: Tweet; prefs: Prefs; cfg: Config): string =
+#proc renderRssTweet(tweet: Tweet; cfg: Config): string =
 #let tweet = tweet.retweet.get(tweet)
 #let urlPrefix = getUrlPrefix(cfg)
-#let text = replaceUrl(tweet.text, prefs, absolute=urlPrefix)
+#let text = replaceUrls(tweet.text, defaultPrefs, absolute=urlPrefix)
+<p>${text.replace("\n", "<br>\n")}</p>
 #if tweet.quote.isSome and get(tweet.quote).available:
 # let quoteLink = getLink(get(tweet.quote))
-<p>${text}<br><a href="${urlPrefix}${quoteLink}">${cfg.hostname}${quoteLink}</a></p>
-#else:
-<p>${text}</p>
+<p><a href="${urlPrefix}${quoteLink}">${cfg.hostname}${quoteLink}</a></p>
 #end if
 #if tweet.photos.len > 0:
 # for photo in tweet.photos:

@@ -44,10 +48,15 @@ Twitter feed for: ${desc}. Generated by ${cfg.hostname}
 # let url = &"{urlPrefix}{getPicUrl(get(tweet.gif).url)}"
 <video poster="${thumb}" autoplay muted loop style="max-width:250px;">
 <source src="${url}" type="video/mp4"</source></video>
+#elif tweet.card.isSome:
+# let card = tweet.card.get()
+# if card.image.len > 0:
+<img src="${urlPrefix}${getPicUrl(card.image)}" style="max-width:250px;" />
+# end if
 #end if
 #end proc
 #
-#proc renderRssTweets(tweets: seq[Tweet]; prefs: Prefs; cfg: Config): string =
+#proc renderRssTweets(tweets: seq[Tweet]; cfg: Config): string =
 #let urlPrefix = getUrlPrefix(cfg)
 #var links: seq[string]
 #for t in tweets:

@@ -58,9 +67,9 @@ Twitter feed for: ${desc}. Generated by ${cfg.hostname}
 # end if
 # links.add link
 <item>
-<title>${getTitle(tweet, prefs, retweet)}</title>
+<title>${getTitle(tweet, retweet)}</title>
 <dc:creator>@${tweet.profile.username}</dc:creator>
-<description><![CDATA[${renderRssTweet(tweet, prefs, cfg).strip(chars={'\n'})}]]></description>
+<description><![CDATA[${renderRssTweet(tweet, cfg).strip(chars={'\n'})}]]></description>
 <pubDate>${getRfc822Time(tweet)}</pubDate>
 <guid>${urlPrefix & link}</guid>
 <link>${urlPrefix & link}</link>

@@ -69,7 +78,6 @@ Twitter feed for: ${desc}. Generated by ${cfg.hostname}
 #end proc
 #
 #proc renderTimelineRss*(timeline: Timeline; profile: Profile; cfg: Config; multi=false): string =
-#let prefs = Prefs(replaceTwitter: cfg.hostname, replaceYouTube: cfg.replaceYouTube)
 #let urlPrefix = getUrlPrefix(cfg)
 #result = ""
 #let user = (if multi: "" else: "@") & profile.username

@@ -94,15 +102,14 @@ Twitter feed for: ${desc}. Generated by ${cfg.hostname}
 <height>128</height>
 </image>
 #if timeline.content.len > 0:
-${renderRssTweets(timeline.content, prefs, cfg)}
+${renderRssTweets(timeline.content, cfg)}
 #end if
 </channel>
 </rss>
 #end proc
 #
 #proc renderListRss*(tweets: seq[Tweet]; list: List; cfg: Config): string =
-#let prefs = Prefs(replaceTwitter: cfg.hostname, replaceYouTube: cfg.replaceYouTube)
-#let link = &"{getUrlPrefix(cfg)}/{list.username}/lists/{list.name}"
+#let link = &"{getUrlPrefix(cfg)}/i/lists/{list.id}"
 #result = ""
 <?xml version="1.0" encoding="UTF-8"?>
 <rss xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/" version="2.0">

@@ -113,13 +120,12 @@ ${renderRssTweets(timeline.content, prefs, cfg)}
 <description>${getDescription(list.name & " by @" & list.username, cfg)}</description>
 <language>en-us</language>
 <ttl>40</ttl>
-${renderRssTweets(tweets, prefs, cfg)}
+${renderRssTweets(tweets, cfg)}
 </channel>
 </rss>
 #end proc
 #
 #proc renderSearchRss*(tweets: seq[Tweet]; name, param: string; cfg: Config): string =
-#let prefs = Prefs(replaceTwitter: cfg.hostname, replaceYouTube: cfg.replaceYouTube)
 #let link = &"{getUrlPrefix(cfg)}/search"
 #let escName = xmltree.escape(name)
 #result = ""

@@ -132,13 +138,12 @@ ${renderRssTweets(tweets, prefs, cfg)}
 <description>${getDescription("Search \"" & escName & "\"", cfg)}</description>
 <language>en-us</language>
 <ttl>40</ttl>
-${renderRssTweets(tweets, prefs, cfg)}
+${renderRssTweets(tweets, cfg)}
 </channel>
 </rss>
 #end proc
 #
 #proc renderThreadRss*(tweets: seq[Tweet]; username, tweetId: string; cfg: Config): string =
-#let prefs = Prefs(replaceTwitter: cfg.hostname, replaceYouTube: cfg.replaceYouTube)
 #let urlPrefix = getUrlPrefix(cfg)
 #result = ""
 <?xml version="1.0" encoding="UTF-8"?>

@@ -150,7 +155,7 @@ ${renderRssTweets(tweets, prefs, cfg)}
 <description>${getDescription("Thread for " & urlPrefix & "/" & username & "/status/" & tweetId, cfg)}</description>
 <language>en-us</language>
 <ttl>40</ttl>
-${renderRssTweets(tweets, prefs, cfg)}
+${renderRssTweets(tweets, cfg)}
 </channel>
 </rss>
 #end proc
@@ -1,3 +1,4 @@
+# SPDX-License-Identifier: AGPL-3.0-only
 import strutils, strformat, sequtils, unicode, tables
 import karax/[karaxdsl, vdom]
 

@@ -24,7 +25,7 @@ proc renderSearch*(): VNode =
     tdiv(class="search-bar"):
       form(`method`="get", action="/search"):
         hiddenField("f", "users")
-        input(`type`="text", name="q", autofocus="", placeholder="Enter username...")
+        input(`type`="text", name="q", autofocus="", placeholder="Enter username...", dir="auto")
         button(`type`="submit"): icon "search"
 
 proc renderProfileTabs*(query: Query; username: string): VNode =
@@ -1,4 +1,4 @@
-import uri
+# SPDX-License-Identifier: AGPL-3.0-only
 import karax/[karaxdsl, vdom]
 
 import ".."/[types, formatters]

@@ -37,7 +37,7 @@ proc renderReplies*(replies: Result[Chain]; prefs: Prefs; path: string): VNode =
       renderReplyThread(thread, prefs, path)
 
   if replies.bottom.len > 0:
-    renderMore(Query(), encodeUrl(replies.bottom), focus="#r")
+    renderMore(Query(), replies.bottom, focus="#r")
 
 proc renderConversation*(conv: Conversation; prefs: Prefs; path: string): VNode =
   let hasAfter = conv.after.content.len > 0
@@ -1,3 +1,4 @@
+# SPDX-License-Identifier: AGPL-3.0-only
 import strutils, strformat, sequtils, algorithm, uri, options
 import karax/[karaxdsl, vdom]
 

@@ -25,7 +26,7 @@ proc renderNewer*(query: Query; path: string; focus=""): VNode =
 
 proc renderMore*(query: Query; cursor: string; focus=""): VNode =
   buildHtml(tdiv(class="show-more")):
-    a(href=(&"?{getQuery(query)}cursor={encodeUrl(cursor)}{focus}")):
+    a(href=(&"?{getQuery(query)}cursor={encodeUrl(cursor, usePlus=false)}{focus}")):
       text "Load more"
 
 proc renderNoMore(): VNode =
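The `usePlus=false` change in `renderMore` above makes Nim's `encodeUrl` emit `%20` for spaces instead of `+`, which matters when a pagination cursor must round-trip through the query string unchanged. A rough Python sketch of the two behaviours (the cursor value is made up for illustration):

```python
from urllib.parse import quote, quote_plus

cursor = "scroll:abc== 2"  # hypothetical cursor containing '=' and a space

# quote_plus is the usePlus=true analogue: spaces become '+'
assert quote_plus(cursor) == "scroll%3Aabc%3D%3D+2"

# quote with no safe chars is the usePlus=false analogue: spaces become '%20'
assert quote(cursor, safe="") == "scroll%3Aabc%3D%3D%202"
```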
@@ -62,7 +63,7 @@ proc renderUser(user: Profile; prefs: Prefs): VNode =
     tdiv(class="tweet-body profile-result"):
       tdiv(class="tweet-header"):
         a(class="tweet-avatar", href=("/" & user.username)):
-          genImg(user.getUserpic("_bigger"), class="avatar")
+          genImg(user.getUserPic("_bigger"), class="avatar")
 
         tdiv(class="tweet-name-row"):
           tdiv(class="fullname-and-username"):

@@ -70,7 +71,7 @@ proc renderUser(user: Profile; prefs: Prefs): VNode =
             linkUser(user, class="username")
 
       tdiv(class="tweet-content media-body", dir="auto"):
-        verbatim replaceUrl(user.bio, prefs)
+        verbatim replaceUrls(user.bio, prefs)
 
 proc renderTimelineUsers*(results: Result[Profile]; prefs: Prefs; path=""): VNode =
   buildHtml(tdiv(class="timeline")):
@@ -1,3 +1,4 @@
+# SPDX-License-Identifier: AGPL-3.0-only
 import strutils, sequtils, strformat, options
 import karax/[karaxdsl, vdom, vstyles]
 

@@ -6,12 +7,12 @@ import ".."/[types, utils, formatters]
 
 proc getSmallPic(url: string): string =
   result = url
-  if "?" notin url:
+  if "?" notin url and not url.endsWith("placeholder.png"):
     result &= ":small"
   result = getPicUrl(result)
 
 proc renderMiniAvatar(profile: Profile): VNode =
-  let url = getPicUrl(profile.getUserpic("_mini"))
+  let url = getPicUrl(profile.getUserPic("_mini"))
   buildHtml():
     img(class="avatar mini", src=url)
 

@@ -28,9 +29,9 @@ proc renderHeader(tweet: Tweet; retweet: string; prefs: Prefs): VNode =
     tdiv(class="tweet-header"):
       a(class="tweet-avatar", href=("/" & tweet.profile.username)):
         var size = "_bigger"
-        if not prefs.autoplayGifs and tweet.profile.userpic.endsWith("gif"):
+        if not prefs.autoplayGifs and tweet.profile.userPic.endsWith("gif"):
           size = "_400x400"
-        genImg(tweet.profile.getUserpic(size), class="avatar")
+        genImg(tweet.profile.getUserPic(size), class="avatar")
 
       tdiv(class="tweet-name-row"):
         tdiv(class="fullname-and-username"):

@@ -65,36 +66,35 @@ proc isPlaybackEnabled(prefs: Prefs; video: Video): bool =
   of m3u8, vmap: prefs.hlsPlayback
 
 proc renderVideoDisabled(video: Video; path: string): VNode =
-  buildHtml(tdiv):
-    img(src=getSmallPic(video.thumb))
-    tdiv(class="video-overlay"):
-      case video.playbackType
-      of mp4:
-        p: text "mp4 playback disabled in preferences"
-      of m3u8, vmap:
-        buttonReferer "/enablehls", "Enable hls playback", path
+  buildHtml(tdiv(class="video-overlay")):
+    case video.playbackType
+    of mp4:
+      p: text "mp4 playback disabled in preferences"
+    of m3u8, vmap:
+      buttonReferer "/enablehls", "Enable hls playback", path
 
 proc renderVideoUnavailable(video: Video): VNode =
-  buildHtml(tdiv):
-    img(src=getSmallPic(video.thumb))
-    tdiv(class="video-overlay"):
-      case video.reason
-      of "dmcaed":
-        p: text "This media has been disabled in response to a report by the copyright owner"
-      else:
-        p: text "This media is unavailable"
+  buildHtml(tdiv(class="video-overlay")):
+    case video.reason
+    of "dmcaed":
+      p: text "This media has been disabled in response to a report by the copyright owner"
+    else:
+      p: text "This media is unavailable"
 
 proc renderVideo*(video: Video; prefs: Prefs; path: string): VNode =
   let container =
     if video.description.len > 0 or video.title.len > 0: " card-container"
     else: ""
 
   buildHtml(tdiv(class="attachments card")):
     tdiv(class="gallery-video" & container):
       tdiv(class="attachment video-container"):
+        let thumb = getSmallPic(video.thumb)
         if not video.available:
+          img(src=thumb)
           renderVideoUnavailable(video)
         elif not prefs.isPlaybackEnabled(video):
+          img(src=thumb)
           renderVideoDisabled(video, path)
         else:
           let vid = video.variants.filterIt(it.videoType == video.playbackType)

@@ -166,7 +166,7 @@ proc renderCardContent(card: Card): VNode =
 proc renderCard(card: Card; prefs: Prefs; path: string): VNode =
   const smallCards = {app, player, summary, storeLink}
   let large = if card.kind notin smallCards: " large" else: ""
-  let url = replaceUrl(card.url, prefs)
+  let url = replaceUrls(card.url, prefs)
 
   buildHtml(tdiv(class=("card" & large))):
     if card.video.isSome:

@@ -181,12 +181,16 @@ proc renderCard(card: Card; prefs: Prefs; path: string): VNode =
       tdiv(class="card-content-container"):
         renderCardContent(card)
 
+func formatStat(stat: int): string =
+  if stat > 0: insertSep($stat, ',')
+  else: ""
+
 proc renderStats(stats: TweetStats; views: string): VNode =
   buildHtml(tdiv(class="tweet-stats")):
-    span(class="tweet-stat"): icon "comment", insertSep($stats.replies, ',')
-    span(class="tweet-stat"): icon "retweet", insertSep($stats.retweets, ',')
-    span(class="tweet-stat"): icon "quote", insertSep($stats.quotes, ',')
-    span(class="tweet-stat"): icon "heart", insertSep($stats.likes, ',')
+    span(class="tweet-stat"): icon "comment", formatStat(stats.replies)
+    span(class="tweet-stat"): icon "retweet", formatStat(stats.retweets)
+    span(class="tweet-stat"): icon "quote", formatStat(stats.quotes)
+    span(class="tweet-stat"): icon "heart", formatStat(stats.likes)
     if views.len > 0:
       span(class="tweet-stat"): icon "play", insertSep(views, ',')
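The new `formatStat` above renders zero counts as an empty string instead of a literal `0`. A one-line Python sketch of the same rule, assuming the snake_case name — Python's format spec supplies the comma grouping that `insertSep` provides in Nim:

```python
def format_stat(stat: int) -> str:
    # Thousands separators for positive counts; empty string hides zeros.
    return f"{stat:,}" if stat > 0 else ""
```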
@@ -201,6 +205,8 @@ proc renderAttribution(profile: Profile): VNode =
   buildHtml(a(class="attribution", href=("/" & profile.username))):
     renderMiniAvatar(profile)
     strong: text profile.fullname
+    if profile.verified:
+      icon "ok", class="verified-icon", title="Verified account"
 
 proc renderMediaTags(tags: seq[Profile]): VNode =
   buildHtml(tdiv(class="media-tag-block")):

@@ -226,6 +232,8 @@ proc renderQuote(quote: Tweet; prefs: Prefs; path: string): VNode =
       tdiv(class="unavailable-quote"):
         if quote.tombstone.len > 0:
           text quote.tombstone
+        elif quote.text.len > 0:
+          text quote.text
         else:
           text "This tweet is unavailable"
 

@@ -247,7 +255,7 @@ proc renderQuote(quote: Tweet; prefs: Prefs; path: string): VNode =
 
     if quote.text.len > 0:
       tdiv(class="quote-text", dir="auto"):
-        verbatim replaceUrl(quote.text, prefs)
+        verbatim replaceUrls(quote.text, prefs)
 
     if quote.hasThread:
       a(class="show-thread", href=getLink(quote)):

@@ -278,9 +286,14 @@ proc renderTweet*(tweet: Tweet; prefs: Prefs; path: string; class=""; index=0;
       tdiv(class="unavailable-box"):
         if tweet.tombstone.len > 0:
           text tweet.tombstone
+        elif tweet.text.len > 0:
+          text tweet.text
         else:
           text "This tweet is unavailable"
 
+      if tweet.quote.isSome:
+        renderQuote(tweet.quote.get(), prefs, path)
+
   let fullTweet = tweet
   var retweet: string
   var tweet = fullTweet

@@ -305,7 +318,7 @@ proc renderTweet*(tweet: Tweet; prefs: Prefs; path: string; class=""; index=0;
         tweetClass &= " tweet-bidi"
 
       tdiv(class=tweetClass, dir="auto"):
-        verbatim replaceUrl(tweet.text, prefs) & renderLocation(tweet)
+        verbatim replaceUrls(tweet.text, prefs) & renderLocation(tweet)
 
       if tweet.attribution.isSome:
         renderAttribution(tweet.attribution.get())

@@ -329,7 +342,7 @@ proc renderTweet*(tweet: Tweet; prefs: Prefs; path: string; class=""; index=0;
         renderQuote(tweet.quote.get(), prefs, path)
 
       if mainTweet:
-        p(class="tweet-published"): text getTweetTime(tweet)
+        p(class="tweet-published"): text getTime(tweet)
 
       if tweet.mediaTags.len > 0:
         renderMediaTags(tweet.mediaTags)
@@ -5,7 +5,7 @@ from parameterized import parameterized
 card = [
     ['Thom_Wolf/status/1122466524860702729',
      'pytorch/fairseq',
-     'Facebook AI Research Sequence-to-Sequence Toolkit written in Python. - pytorch/fairseq',
+     'Facebook AI Research Sequence-to-Sequence Toolkit written in Python. - GitHub - pytorch/fairseq: Facebook AI Research Sequence-to-Sequence Toolkit written in Python.',
      'github.com', True],
 
     ['nim_lang/status/1136652293510717440',

@@ -48,12 +48,7 @@ no_thumb = [
     ['nim_lang/status/1082989146040340480',
      'Nim in 2018: A short recap',
      'Posted in r/programming by u/miran1',
-     'reddit.com'],
-
-    ['lorenlugosch/status/1115440394148487168',
-     'Fluent Speech Commands: A dataset for spoken language understanding research',
-     'In recent years, with the advent of deep neural networks, the accuracy of speech recognition models have been notably improved which have made possible the production of speech-to-text systems that...',
-     'fluent.ai']
+     'reddit.com']
 ]
 
 playable = [
@@ -16,7 +16,7 @@ timeline = [
 ]
 
 status = [
-    [20, 'jack', 'jack', '21 Mar 2006', 'just setting up my twttr'],
+    [20, 'jack⚡️', 'jack', '21 Mar 2006', 'just setting up my twttr'],
     [134849778302464000, 'The Twoffice', 'TheTwoffice', '11 Nov 2011', 'test'],
     [105685475985080322, 'The Twoffice', 'TheTwoffice', '22 Aug 2011', 'regular tweet'],
     [572593440719912960, 'Test account', 'mobile_test', '3 Mar 2015', 'testing test']

@@ -42,7 +42,7 @@ link = [
     ['nim_lang/status/1110499584852353024', [
         'nim-lang.org/araq/ownedrefs.…',
         'news.ycombinator.com/item?id…',
-        'old.reddit.com/r/programming…'
+        'teddit.net/r/programming…'
     ]],
     ['nim_lang/status/1125887775151140864', [
         'en.wikipedia.org/wiki/Nim_(p…'

@@ -71,7 +71,7 @@ emoji = [
 
 retweet = [
     [7, 'mobile_test_2', 'mobile test 2', 'Test account', '@mobile_test', '1234'],
-    [3, 'mobile_test_8', 'mobile test 8', 'jack', '@jack', 'twttr']
+    [3, 'mobile_test_8', 'mobile test 8', 'jack⚡️', '@jack', 'twttr']
 ]
 
 reply = [
@@ -1,5 +1,6 @@
 from base import BaseTestCase, Poll, Media
 from parameterized import parameterized
+from selenium.webdriver.common.by import By
 
 poll = [
     ['nim_lang/status/1064219801499955200', 'Style insensitivity', '91', 1, [

@@ -53,8 +54,8 @@ class MediaTest(BaseTestCase):
         poll_choices = self.find_elements(Poll.choice)
         for i, (v, o) in enumerate(choices):
             choice = poll_choices[i]
-            value = choice.find_element_by_class_name(Poll.value)
-            option = choice.find_element_by_class_name(Poll.option)
+            value = choice.find_element(By.CLASS_NAME, Poll.value)
+            option = choice.find_element(By.CLASS_NAME, Poll.option)
             choice_class = choice.get_attribute('class')
 
             self.assert_equal(v, value.text)

@@ -106,7 +107,7 @@ class MediaTest(BaseTestCase):
         self.assert_equal(len(rows), len(gallery_rows))
 
         for i, row in enumerate(gallery_rows):
-            images = row.find_elements_by_css_selector('img')
+            images = row.find_elements(By.CSS_SELECTOR, 'img')
             self.assert_equal(len(rows[i]), len(images))
             for j, image in enumerate(images):
                 self.assertIn(rows[i][j], image.get_attribute('src'))