
curl-impersonate's People

Contributors

alkarex, bjia56, djoldman, izzues, jwilk, lilyinstarlight, lwthiker, matheusfillipe, nicoandmee, peterupfold, rizialdi, weebdatahoarder, wrobelda


curl-impersonate's Issues

Any problematic flags?

Hi, thanks for this nice project!

I was looking to use the program with some extra curl flags such as --cookie/--ipv4 and wondered if any of these could cause the default fingerprint to change. I heard that even the order of the headers sent could affect the fingerprint so I am wondering if there are any issues using additional flags such as the above. Are there any known "problematic" flags which could possibly affect the default fingerprint?

Does it work when using proxy?

If I use curl --proxy to send requests through an HTTP proxy, will it properly forward everything to the destination server as intended?

How to cleanly uninstall curl-impersonate?

Hi,

I'm not gonna lie, I don't know much about how this project works, and some of its concepts are way beyond me, but I wanted to give it a try for scraping something.

The thing is, now I want to uninstall this project.

So I deleted the folder where I cloned the repo and removed some packages installed with apt, but I still have access to the curl_chrome99 command.

Is there a procedure for cleanly uninstalling curl-impersonate?

Thanks in advance 😀
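A starting point for tracking down the leftovers, assuming the default make install prefix of /usr/local (an assumption; check how you installed), might be:

```shell
# Locate the wrapper script that still resolves, then inspect its directory.
# The paths below are assumptions based on the default /usr/local prefix;
# verify before deleting anything.
target=$(command -v curl_chrome99 || true)
echo "wrapper found at: ${target:-<none>}"
# If found, the sibling files are typically the other curl_* wrappers and
# the curl-impersonate-chrome / curl-impersonate-ff binaries, e.g.:
#   sudo rm /usr/local/bin/curl_chrome* /usr/local/bin/curl-impersonate-chrome
```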

passing JSON data yields error: bad/illegal format

Issue

The following command in curl works:

curl -X POST -H "Content-Type: application/json" -d '{"email": "[email protected]", "password": "password", "detail": "Whats_This"}' https://dangerous.url/cgi-bin/sharept-nextz.php

The same command with curl_ff91esr does not:

curl_ff91esr -X POST -H "Content-Type: application/json" -d '{"email": "[email protected]", "password": "password", "detail": "Whats_This"}' https://dangerous.url/cgi-bin/sharept-nextz.php

Issuing the above command results in the following error:

curl: (6) Could not resolve host: application
curl: (6) Could not resolve host: work.com",
curl: (3) URL using bad/illegal format or missing URL
curl: (6) Could not resolve host: "password",
curl: (3) URL using bad/illegal format or missing URL
curl: (3) unmatched close brace/bracket in URL position 13:
"Whats_This"}

I get a similar error if I use curl_ff95. It seems the HTTP POST data string isn't being parsed correctly. Unfortunately, I cannot test this with the Chrome version, since it fails to compile on my system.
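For what it's worth, one common cause of exactly this error pattern is word splitting: if a wrapper script passes its arguments as an unquoted $@ instead of "$@", a -d body containing spaces is split into several arguments, which curl then treats as extra URLs. A minimal standalone demonstration (plain shell, not the actual wrapper scripts):

```shell
# Hypothetical illustration: unquoted argument expansion splits a JSON
# body at its spaces, while the quoted form preserves it intact.
unquoted() { for arg in $@;   do echo "arg: $arg"; done; }
quoted()   { for arg in "$@"; do echo "arg: $arg"; done; }

unquoted -d '{"email": "a"}'   # body splits: 3 arguments total
quoted   -d '{"email": "a"}'   # body intact: 2 arguments total
```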

Additional Details

I am using the curl-impersonate AUR package. The output from curl_ff91esr -V is:

curl 7.81.0 (x86_64-pc-linux-gnu) libcurl/7.81.0 NSS/3.74 zlib/1.2.12 brotli/1.0.9 zstd/1.5.2 libidn2/2.3.2 libpsl/0.21.1 (+libidn2/2.3.0) nghttp2/1.46.0 librtmp/2.3 OpenLDAP/2.6.1
Release-Date: 2022-01-05
Protocols: dict file ftp ftps gopher gophers http https imap imaps ldap ldaps mqtt pop3 pop3s rtmp rtsp smb smbs smtp smtps telnet tftp 
Features: alt-svc AsynchDNS brotli HSTS HTTP2 HTTPS-proxy IDN IPv6 Largefile libz NTLM NTLM_WB PSL SSL UnixSockets zstd

Please note that the URL is fictitious. I don't want to share the real URL publicly since it is associated with a phishing campaign.

chrome104 and curl_easy_impersonate

Hello,
I am unable to set the chrome104 target in a curl_easy_impersonate function call.

libcurl-impersonate-chrome.so version 0.5.2

The call returns the error: curl: A libcurl function was given a bad argument

Does this library support chrome104?

Thank you

"curl-impersonate-chrome: No such file or directory" (but the file exists)

Hey,

I downloaded the binaries from the Releases page. It works on Debian, but on NixOS I'm getting the following error:

$ ./curl_chrome99
# [...]/curl_chrome99: line 10: [...]/curl-impersonate-chrome: No such file or directory

When I run ldd [...]/curl-impersonate-chrome, this is what I see:

    linux-vdso.so.1 (0x00007ffe85ba6000)
    libz.so.1 => not found
    libpthread.so.0 => /nix/store/f6kvkdzp6qfjm6h94d0jgfvm4r06xcaq-glibc-2.34-210/lib/libpthread.so.0 (0x00007f03352ed000)
    libc.so.6 => /nix/store/f6kvkdzp6qfjm6h94d0jgfvm4r06xcaq-glibc-2.34-210/lib/libc.so.6 (0x00007f03350ef000)
    /lib64/ld-linux-x86-64.so.2 => /nix/store/f6kvkdzp6qfjm6h94d0jgfvm4r06xcaq-glibc-2.34-210/lib64/ld-linux-x86-64.so.2 (0x00007f0335606000)

Unsure if related: I see a test is performed in the build script, but it doesn't check for zlib. In addition, I don't think it works as expected (if I'm not mistaken, grep -q returns success if any of the patterns is found):

RUN ! (ldd ./out/curl-impersonate | grep -q -e libcurl -e nghttp2 -e brotli -e ssl -e crypto)
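The concern is easy to confirm in isolation: grep with several -e patterns succeeds when any one of them matches, so the negated check only fails the build if at least one of the listed libraries appears in the ldd output. A quick sanity check (hypothetical input lines, not the real build output):

```shell
# grep -q with multiple -e patterns exits 0 if ANY pattern matches...
printf 'libssl.so.1 => /lib/libssl.so.1\n' | grep -q -e libcurl -e ssl \
  && echo "matched"
# ...and exits nonzero only when NONE of the patterns match.
printf 'libm.so.6 => /lib/libm.so.6\n' | grep -q -e libcurl -e ssl \
  || echo "no match"
```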

curl: /usr/local/lib/libcurl.so.4: no version information available (required by curl)

When using libcurl-impersonate.so (I'm not sure which one to use; there are .so, .so.4, and .so.4.7.0), compiled and swapped in as a replacement, curl shows a "version information not available" error. I'm not sure if I did something wrong. I compiled it using the Docker instructions.

curl --version
curl: /usr/local/lib/libcurl.so.4: no version information available (required by curl)
curl 7.68.0 (x86_64-pc-linux-gnu) libcurl/7.81.0 BoringSSL zlib/1.2.11 brotli/1.0.9 nghttp2/1.46.0
Release-Date: 2020-01-08
Protocols: dict file ftp ftps gopher gophers http https imap imaps mqtt pop3 pop3s rtsp smb smbs smtp smtps telnet tftp
Features: alt-svc AsynchDNS brotli HTTP2 HTTPS-proxy IPv6 Largefile libz NTLM NTLM_WB SSL UnixSockets
WARNING: curl and libcurl versions do not match. Functionality may be affected.
ldd /usr/bin/curl
/usr/bin/curl: /usr/local/lib/libcurl.so.4: no version information available (required by /usr/bin/curl)
        linux-vdso.so.1 (0x00007ffc4bf3c000)
        libcurl.so.4 => /usr/local/lib/libcurl.so.4 (0x00007f68424ca000)
        libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f68424a4000)
        libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f6842481000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f684228f000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f68427c3000)

Fedora packaging?

Hey there,

This project is super useful; however, manually updating binaries/building from source can be tedious. Would it be possible for this to be packaged for Fedora, perhaps utilising Copr (effectively Fedora user repos)?

(While I'm here, it might be nice to replace yum with dnf in the readme :3)

Thanks,
Elliott

Non-dockerized build script

We should have a build script that builds curl-impersonate and its dependencies on the local system, not within a container. This would allow compiling curl-impersonate for a broader range of platforms, including macOS, other Linux distributions, etc.

Once there's a build script it could be used in the GitHub Actions workflow to automatically publish compiled binaries for multiple platforms.

Impersonate Safari on Mac

Safari being the second most used desktop browser according to some websites, it could be a good candidate for impersonation as well. I don't have access to a Mac right now. Is anyone willing to share a Wireshark capture of a TLS session from Safari? Bonus points if it's HTTP/2 and if it can be decrypted as well (you can set up a local nginx with a self-signed key for this).

Musl .so version

Is it possible to get a musl version?

/etc/supervisor/conf.d # LD_PRELOAD='/var/www/resources/binary/curl/libcurl-impersonate-chrome.so' curl --version
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __fdelt_chk: symbol not found
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __memcpy_chk: symbol not found
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __vsnprintf_chk: symbol not found
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __strcpy_chk: symbol not found
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __memset_chk: symbol not found
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __fprintf_chk: symbol not found
Error relocating /var/www/resources/binary/curl/libcurl-impersonate-chrome.so: __sprintf_chk: symbol not found
/etc/supervisor/conf.d # curl --version
curl 7.83.1 (x86_64-alpine-linux-musl) libcurl/7.83.1 OpenSSL/1.1.1q zlib/1.2.12 brotli/1.0.9 nghttp2/1.47.0
Release-Date: 2022-05-11
Protocols: dict file ftp ftps gopher gophers http https imap imaps mqtt pop3 pop3s rtsp smb smbs smtp smtps telnet tftp 
Features: alt-svc AsynchDNS brotli HSTS HTTP2 HTTPS-proxy IPv6 Largefile libz NTLM NTLM_WB SSL TLS-SRP UnixSockets

Dockers hangs on multiple requests

Awesome project, congrats!

The only problem I found is that the Docker container hangs when you try to make simultaneous requests.

Do you know how this can be fixed?

I am trying to execute curl_chrome99 100 times at the same time.
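For anyone reproducing this, a common way to bound the concurrency is xargs -P; the sketch below uses echo as a stand-in for the real curl_chrome99 invocation (an assumption about how the requests are launched):

```shell
# Launch 100 "requests" with at most 10 running at a time; substitute the
# echo with something like: curl_chrome99 -s -o /dev/null "$url"
seq 100 | xargs -P10 -I{} echo "request {}" | wc -l
```

Capping parallelism this way makes it easier to tell whether the hang comes from resource exhaustion inside the container or from the binary itself.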

Does this project provide a solution for connecting to a web socket?

Does this project provide a solution for connecting to a web socket?
For example, I need an example like this in nodejs:

var socket = new WebSocket(url);
socket.onopen = function() {
...
};
socket.onmessage = function(e) {
...
};

The url in that example is on cloudflare.

No support for brotli on content-encoding

As part of the headers sent to the server, accept-encoding: gzip, deflate, br is included.

However, this curl only supports deflate, gzip, so a server returning content-encoding: br will fail to decompress even with the --compressed argument.

Add mobile browsers

Hello,

Thank you for your incredible work.

Is it possible to add mobile browsers? And probably other targets too, like Chrome on Linux or Chrome on macOS.

Thank you

./generate_dockerfiles.sh Shooot, could not parse view as JSON.

When I run ./generate_dockerfiles.sh command, I get this error:

# ./generate_dockerfiles.sh
Shooot, could not parse view as JSON.
Tips: functions are not valid JSON and keys / values must be surround with double quotes.

SyntaxError: Unexpected number in JSON at position 1
    at JSON.parse (<anonymous>)
    at parseView (/usr/lib/node_modules/mustache/bin/mustache:74:17)
    at onDone (/usr/lib/node_modules/mustache/bin/mustache:67:10)
    at Socket.onEnd (/usr/lib/node_modules/mustache/bin/mustache:118:5)
    at Object.onceWrapper (events.js:286:20)
    at Socket.emit (events.js:203:15)
    at endReadableNT (_stream_readable.js:1145:12)
    at process._tickCallback (internal/process/next_tick.js:63:19)
Shooot, could not parse view as JSON.
Tips: functions are not valid JSON and keys / values must be surround with double quotes.
:
:

File contents:

# cat generate_dockerfiles.sh

#!/bin/sh
cat <<EOF | mustache - Dockerfile.template > chrome/Dockerfile
---
chrome: true
debian: true
---
EOF
:
:

Version:

# node -v
v10.24.1
# npm -v
6.14.12
# mustache -v
4.2.0

I could not resolve the problem, but I worked around it another way.

I created a file config.json with this content:

{
"chrome": true,
"debian": true
}

And then I ran this:

# cat config.json | mustache - Dockerfile.template > chrome/Dockerfile

It works.

But you may want to fix that error above.

Thanks for your nice project.

CURLOPT_USERAGENT ignored when using CURL_IMPERSONATE env var

Reported originally by @momala454 in #42 :

I have a small problem with your pull request (actually, with the whole libcurl version). When I'm using CURL_IMPERSONATE=chrome98, I'm unable to overwrite the default user-agent header to Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/100.0.4896.75 Safari/537.36;
it stays Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36.

When I set CURLOPT_USERAGENT, it does not overwrite the default user-agent set with the CURL_IMPERSONATE env variable. It only changes if I modify the user-agent header using CURLOPT_HTTPHEADER.

Native Windows build

Write a script to build curl-impersonate natively on Windows. This will probably require building each of the dependencies (boringssl, nghttp2, brotli & curl) on Windows.

Response Headers

Hello there. Is it possible to get the response headers? The example in the readme, curl_chrome99 https://www.wikipedia.org, only returns the body of the response. It would be nice if we could get the full response by passing an additional flag on the CLI. Some websites return useful cookies.

415 Unsupported Media Type

There seems to be an encoding issue when the Content-Type header is added with a space.

Doesn't work:
-H 'Content-Type: application/json'

Returns:
[{"status":"FAILURE","message":"HTTP 415 Unsupported Media Type","code":"{unsupported.media.type}","requestedUrl":"/search/template"}]curl: (6) Could not resolve host: application

Works:
-H 'Content-Type:application/json'
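This is consistent with the wrapper scripts word-splitting their arguments: with the space, application/json becomes a separate argument that curl then tries to resolve as a URL. A small standalone demonstration of the splitting (not the actual scripts):

```shell
# Positional parameters expanded unquoted are split at whitespace, so the
# header value after the space becomes its own word.
set -- -H 'Content-Type: application/json'
echo "quoted:   $(printf '%s\n' "$@" | wc -l) arguments"  # header intact
echo "unquoted: $(printf '%s\n' $@   | wc -l) words"      # header split
```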

Impersonate command argument?

The source code already contains targets with impersonation options. How about implementing an --impersonate <target> command-line switch?

AUR Arch linux packages

This is a very interesting project; congratulations to the devs!

I have created 2 AUR packages for it, using the Dockerfiles as a reference and fetching the patches from this repo.
They will install the curl-impersonate-chrome and curl-impersonate-firefox commands. I haven't added the scripts that set the browser headers, though, since they didn't matter much to me.

Maybe you would be interested in adding these to your readme?

https://aur.archlinux.org/packages/curl-impersonate-chrome
https://aur.archlinux.org/packages/curl-impersonate-firefox

User agent client hints also sent in http

https://wicg.github.io/ua-client-hints/#security-privacy
Client Hints will not be delivered to non-secure endpoints (see the secure transport requirements in Section 2.2.1 of [[RFC8942]](https://wicg.github.io/ua-client-hints/#biblio-rfc8942)).

The sec-ch-xxx headers must not be sent when the URL is http://, only https://,
but if I set the CURL_IMPERSONATE=chrome98 env variable, those user agent client hint headers are always set, even over http:

putenv('CURL_IMPERSONATE=chrome98');
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://headers.cf');
curl_setopt($ch, CURLINFO_HEADER_OUT, 1);
curl_setopt($ch, CURLOPT_ENCODING, "");
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
//curl_setopt($ch, CURLOPT_HTTPHEADER, ['Host: abc.com']);
curl_setopt($ch, CURLOPT_AUTOREFERER, true);

echo curl_exec($ch);
print_r(curl_getinfo($ch));

(Note that the website redirects to the https version, but we are not following the redirect.)

Headers sent:

GET / HTTP/1.1
Host: headers.cf
Connection: Upgrade, HTTP2-Settings
Upgrade: h2c
HTTP2-Settings: AAEAAQAAAAMAAAPoAAQAYAAAAAYABAAAjau_38Px
sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="98", "Google Chrome";v="98"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: none
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Sec-Fetch-Dest: document
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9

real headers sent by chrome :

GET / HTTP/1.1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Accept-Encoding: gzip, deflate
Accept-Language: fr
Connection: keep-alive
Host: headers.cf
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/100.0.4896.88 Safari/537.36

You can also see some differences; for example, only with curl do I see the HTTP2-Settings header.

Also, there are a lot of user agent client hint headers. Once a website indicates that it wants more headers, Chrome will send them:
https://headers.cf/
Go to the website and a few headers are sent. Refresh the page and a lot of headers are sent. curl-impersonate only sends the minimal set of the first request.
The browser keeps the list of headers that the domain wants in its cache.

I also don't know whether, if the website sends an empty Accept-CH header, Chrome stops sending the 3 base headers (sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="98", "Google Chrome";v="98", sec-ch-ua-mobile: ?0, sec-ch-ua-platform: "Windows") or still sends them.
If Chrome doesn't send them, that's another way to detect a spoofed Chrome, but it only works on the second request, since the browser must first learn which headers the domain supports.
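The gating the spec requires can be expressed very simply; this is just an illustration of the expected behavior (a hypothetical helper, not project code):

```shell
# Emit sec-ch-* client hint headers only for https URLs, per the
# "secure transport" requirement quoted above.
client_hint_headers() {
  case "$1" in
    https://*)
      printf '%s\n' 'sec-ch-ua-mobile: ?0' 'sec-ch-ua-platform: "Windows"'
      ;;
    *)
      : # plain http: send no client hints at all
      ;;
  esac
}

client_hint_headers "https://example.com"  # prints the hint headers
client_hint_headers "http://example.com"   # prints nothing
```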

Have the headers changed a bit with Chrome 103?

I checked my laptop's Windows Chrome 103 headers using your socat procedure. Compared to the chrome101 curl-impersonate script's results, I see extra headers: Connection:, Cache-Control:, and also DNT:. That last one might be due to my personal Chrome settings; I'm not sure. The sec-ch-ua: string also seems to be a bit different.

Chrome 103:

GET / HTTP/1.1\r
Host: xx.xx.xx.xx:8443\r
Connection: keep-alive\r
Cache-Control: max-age=0\r
sec-ch-ua: ".Not/A)Brand";v="99", "Google Chrome";v="103", "Chromium";v="103"\r
sec-ch-ua-mobile: ?0\r
sec-ch-ua-platform: "Windows"\r
DNT: 1\r
Upgrade-Insecure-Requests: 1\r
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.0.0 Safari/537.36\r
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9\r
Sec-Fetch-Site: none\r
Sec-Fetch-Mode: navigate\r
Sec-Fetch-User: ?1\r
Sec-Fetch-Dest: document\r
Accept-Encoding: gzip, deflate, br\r
Accept-Language: en-US,en;q=0.9\r

curl-impersonate chrome101:

GET / HTTP/1.1\r
Host: localhost:8443\r
sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="101", "Google Chrome";v="101"\r
sec-ch-ua-mobile: ?0\r
sec-ch-ua-platform: "Windows"\r
Upgrade-Insecure-Requests: 1\r
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.67 Safari/537.36\r
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9\r
Sec-Fetch-Site: none\r
Sec-Fetch-Mode: navigate\r
Sec-Fetch-User: ?1\r
Sec-Fetch-Dest: document\r
Accept-Encoding: gzip, deflate, br\r
Accept-Language: en-US,en;q=0.9\r

Related question: I tried setting custom headers via node-libcurl in my program by passing an array to my .get() method's HTTPHEADER option, like below, but in the resulting request socat shows that the Connection: and Cache-Control: headers end up last, despite my array listing them first. Is this a curl bug? Is there any way to override it to get the right order, like Chrome 103?

let headers = [
    'Connection: keep-alive',
    'Cache-Control: max-age=0',
    'sec-ch-ua: ".Not/A)Brand";v="99", "Google Chrome";v="103", "Chromium";v="103"',
    'sec-ch-ua-mobile: ?0',
    'sec-ch-ua-platform: "Windows"',
    'Upgrade-Insecure-Requests: 1',
    'User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.0.0 Safari/537.36',
    'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9',
    'Sec-Fetch-Site: none',
    'Sec-Fetch-Mode: navigate',
    'Sec-Fetch-User: ?1',
    'Sec-Fetch-Dest: document',
    'Accept-Encoding: gzip, deflate, br',
    'Accept-Language: en-US,en;q=0.9'
];
let response = await aCurly.get('https://localhost:8443/', { SSL_VERIFYPEER: false, SSL_VERIFYHOST: false, HTTPHEADER: headers });

socat result:

GET / HTTP/1.1\r
Host: localhost:8443\r
sec-ch-ua: ".Not/A)Brand";v="99", "Google Chrome";v="103", "Chromium";v="103"\r
sec-ch-ua-mobile: ?0\r
sec-ch-ua-platform: "Windows"\r
Upgrade-Insecure-Requests: 1\r
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.0.0 Safari/537.36\r
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9\r
Sec-Fetch-Site: none\r
Sec-Fetch-Mode: navigate\r
Sec-Fetch-User: ?1\r
Sec-Fetch-Dest: document\r
Accept-Encoding: gzip, deflate, br\r
Accept-Language: en-US,en;q=0.9\r
Connection: keep-alive\r
Cache-Control: max-age=0\r

I tried editing the curl_chrome101 script to add these 2 headers there, with the same behavior: they end up at the bottom of the headers instead of the top.

`curl_easy_impersonate()` adds Accept-Encoding to req. headers w/out enabling cURL's auto decompression

Hi,

Not sure if this is actually intended behavior, but calling curl_easy_impersonate(self.session, "ff98") adds Accept-Encoding: gzip, deflate, br to the request headers, as expected. Most sites will respond by sending back compressed content, but libcurl won't perform its automatic decompression on it.

While it's no problem to set CURLOPT_ACCEPT_ENCODING before/after calling curl_easy_impersonate to enable automatic decompression of received content, it requires the coder to know which Accept-Encoding value curl_easy_impersonate sets.

"lwthiker/curl-impersonate:0.5.1-ff" is built using Alpine, not Debian

It seems that something is potentially off with your generate_dockerfiles.sh script, as the non-Alpine images are, in fact, also Alpine:

cromo@docker:~/rss-bridge/rss-bridge$ sudo docker run -it lwthiker/curl-impersonate:0.5.1-ff sh
/ # apk
apk-tools 2.12.7, compiled for x86_64.

usage: apk [<OPTIONS>...] COMMAND [<ARGUMENTS>...]

Package installation and removal:
  add        Add packages to WORLD and commit changes
  del        Remove packages from WORLD and commit changes

System maintenance:
  fix        Fix, reinstall or upgrade packages without modifying WORLD
  update     Update repository indexes
  upgrade    Install upgrades available from repositories
  cache      Manage the local package cache

Querying package information:
  info       Give detailed information about packages or repositories
  list       List packages matching a pattern or other criteria
  dot        Render dependencies as graphviz graphs
  policy     Show repository policy for packages
  search     Search for packages by name or description

Repository maintenance:
  index      Create repository index file from packages
  fetch      Download packages from global repositories to a local directory
  manifest   Show checksums of package contents
  verify     Verify package integrity and signature

Miscellaneous:
  audit      Audit system for changes
  stats      Show statistics about repositories and installations
  version    Compare package versions or perform tests on version strings

This apk has coffee making abilities.

Manually compile with Docker file

I tried to compile the program on my system following this Dockerfile:
https://github.com/lwthiker/curl-impersonate/blob/main/chrome/Dockerfile

# c++ --version
c++ (Ubuntu 9.4.0-1ubuntu1~20.04.1) 9.4.0
Copyright (C) 2019 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

# lsb_release -a
No LSB modules are available.
Distributor ID:	Ubuntu
Description:	Ubuntu 20.04.4 LTS
Release:	20.04
Codename:	focal

# ninja --version
1.10.0

When I typed this command:

cmake -DCMAKE_BUILD_TYPE=Release -DCMAKE_POSITION_INDEPENDENT_CODE=on -GNinja ..

I got this error:

-- Checking for module 'libunwind-generic'
--   No package 'libunwind-generic' found
libunwind not found. Disabling unwind tests.
CMake Error at CMakeLists.txt:51 (message):
  Could not find Go

I was able to solve the problem with this command:

apt-get install -y libunwind-dev

It was strange that such a package is not installed in the Docker image!

Anyway, I went ahead and got this error after running the ninja command:

# ninja
[5/105] Building CXX object crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o
FAILED: crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o 
/usr/bin/c++  -DBORINGSSL_HAVE_LIBUNWIND -DBORINGSSL_IMPLEMENTATION -I../third_party/googletest/include -I../crypto/../include -Werror -Wformat=2 -Wsign-compare -Wmissing-field-initializers -Wwrite-strings -Wvla -Wshadow -ggdb -Wall -fvisibility=hidden -fno-common -Wno-free-nonheap-object -Wimplicit-fallthrough -Wmissing-declarations -std=c++11 -fno-exceptions -fno-rtti -O3 -DNDEBUG -fPIC -MD -MT crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o -MF crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o.d -o crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o -c ../crypto/test/abi_test.cc
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
  211 |   write(STDERR_FILENO, buf, strlen(buf));
      |   ~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*, const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*, const char*, const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
cc1plus: all warnings being treated as errors
[6/105] Building CXX object ssl/CMakeFiles/ssl_test.dir/span_test.cc.o
ninja: build stopped: subcommand failed.

This error appeared when I ran it again:

# ninja
[1/100] Building CXX object crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o
FAILED: crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o 
/usr/bin/c++  -DBORINGSSL_HAVE_LIBUNWIND -DBORINGSSL_IMPLEMENTATION -I../third_party/googletest/include -I../crypto/../include -Werror -Wformat=2 -Wsign-compare -Wmissing-field-initializers -Wwrite-strings -Wvla -Wshadow -ggdb -Wall -fvisibility=hidden -fno-common -Wno-free-nonheap-object -Wimplicit-fallthrough -Wmissing-declarations -std=c++11 -fno-exceptions -fno-rtti -O3 -DNDEBUG -fPIC -MD -MT crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o -MF crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o.d -o crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o -c ../crypto/test/abi_test.cc
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
  211 |   write(STDERR_FILENO, buf, strlen(buf));
      |   ~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*, const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*, const char*, const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
cc1plus: all warnings being treated as errors
[2/100] Building CXX object ssl/CMakeFiles/ssl_test.dir/ssl_test.cc.o
ninja: build stopped: subcommand failed.

And next time again a new error:

# ninja
[3/99] Building CXX object crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o
FAILED: crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o 
/usr/bin/c++  -DBORINGSSL_HAVE_LIBUNWIND -DBORINGSSL_IMPLEMENTATION -I../third_party/googletest/include -I../crypto/../include -Werror -Wformat=2 -Wsign-compare -Wmissing-field-initializers -Wwrite-strings -Wvla -Wshadow -ggdb -Wall -fvisibility=hidden -fno-common -Wno-free-nonheap-object -Wimplicit-fallthrough -Wmissing-declarations -std=c++11 -fno-exceptions -fno-rtti -O3 -DNDEBUG -fPIC -MD -MT crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o -MF crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o.d -o crypto/test/CMakeFiles/test_support_lib.dir/abi_test.cc.o -c ../crypto/test/abi_test.cc
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
  211 |   write(STDERR_FILENO, buf, strlen(buf));
      |   ~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*, const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
../crypto/test/abi_test.cc: In function ‘void abi_test::internal::FatalError(Args ...) [with Args = {const char*, const char*, const char*}]’:
../crypto/test/abi_test.cc:211:8: error: ignoring return value of ‘ssize_t write(int, const void*, size_t)’, declared with attribute warn_unused_result [-Werror=unused-result]
cc1plus: all warnings being treated as errors
[4/99] Building CXX object ssl/CMakeFiles/ssl.dir/d1_both.cc.o
ninja: build stopped: subcommand failed.

In fact, every time I execute the command, it fails on a new file!

How can I solve this?

add available browsers as json file

Hello, is it possible to create a JSON file listing all available browsers and the current Docker image?

Something like:

{
  "docker": "lwthiker/curl-impersonate:0.4",
  "browsers": {
    "chrome98": "chrome",
    "chrome99": "chrome",
    "edge98": "chrome",
    "edge99": "chrome",
    "safari15_3": "chrome",
    "ff91esr": "ff",
    "ff95": "ff",
    "ff98": "ff"
  }
}

Of course, more details could be added for each browser.

Command returns gibberish

Hey,

I built the project using docker build -t curl-impersonate ., then I entered a shell with docker run --rm -it --entrypoint bash curl-impersonate.

When I run "vanilla" curl -L google.com, I get the expected human-readable response, but when I run /build/out/curl_ff95 -L google.com, I get garbled text.

Am I doing something wrong?

Thanks!
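A likely cause (an assumption, since the exact output isn't shown): the wrapper scripts send `Accept-Encoding: gzip, deflate, br` to match the browser, so the server is free to return a compressed body, and curl writes the raw bytes to stdout unless you pass `--compressed` (e.g. `/build/out/curl_ff95 --compressed -L google.com`). The effect can be reproduced offline:

```shell
# Compressed bytes written straight to the terminal look like gibberish;
# decoding them restores the text, which is what --compressed asks curl to do.
msg="hello from curl-impersonate"

# What an undecoded response body looks like (raw gzip bytes):
printf '%s' "$msg" | gzip -c | od -c | head -n 2

# What curl does when --compressed is given: decode before printing.
decoded=$(printf '%s' "$msg" | gzip -c | gzip -dc)
echo "decoded: $decoded"
```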

How can I use this library with PHP on Mac

Hi,

First of all, thank you for this great library 👍

I want to force PHP to use curl-impersonate instead of the standard curl library. How should I configure my system to achieve that? I see that there is a solution for Linux, but is there also a solution for macOS?

➜  ~ sw_vers
ProductName:	Mac OS X
ProductVersion:	10.15.7
BuildVersion:	19H2026
➜  ~ php -v
PHP 8.1.10 (cli) (built: Aug 30 2022 19:22:00) (NTS)
Copyright (c) The PHP Group
Zend Engine v4.1.10, Copyright (c) Zend Technologies
    with Zend OPcache v8.1.10, Copyright (c), by Zend Technologies
➜  ~ otool -L /usr/bin/curl
/usr/bin/curl:
	/usr/lib/libcurl.4.dylib (compatibility version 7.0.0, current version 9.0.0)
	/usr/lib/libz.1.dylib (compatibility version 1.0.0, current version 1.2.11)
	/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 1281.100.1)
➜  ~ 
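On Linux this is usually done with LD_PRELOAD; the macOS analogue would be DYLD_INSERT_LIBRARIES. A heavily hedged sketch: the dylib name and path below are assumptions, and System Integrity Protection may strip DYLD_* variables for protected binaries (a Homebrew-installed php is usually not protected):

```shell
# Hypothetical macOS analogue of the Linux LD_PRELOAD approach (untested):
# point the dynamic loader at libcurl-impersonate before starting PHP.
CURL_IMPERSONATE=chrome98 \
DYLD_INSERT_LIBRARIES=/usr/local/lib/libcurl-impersonate-chrome.dylib \
DYLD_FORCE_FLAT_NAMESPACE=1 \
php /path/to/script.php
```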

Support building on Amazon Linux (or CentOS / Red Hat Linux)

I thought this would be quite similar to building on Ubuntu, but I tried for hours and failed.

My efforts include:

  • install build prerequisites: sudo yum groupinstall "Development Tools"
  • upgrade cmake: pip3 install cmake --upgrade
  • create a symbolic link for ninja-build: ln -s /usr/bin/ninja-build /usr/bin/ninja
  • install NSS: sudo yum install libnss3.so libnssutil3.so
  • add a search path (/usr/lib/) for the NSS libs in curl-impersonate.patch:
    -+          search_paths="/usr/lib/$host /usr/lib/$host/nss"
    ++          search_paths="/usr/lib/ /usr/lib/$host /usr/lib/$host/nss"

make firefox-build still failed; the log suggests that linking NSS statically failed:

configure: WARNING: Using hard-wired libraries and compilation flags for NSS.
checking if libnssckbi is in a non-standard location... /usr/lib
checking for SSL_VersionRangeSet in -lnss_static... no
configure: error: Failed linking NSS statically
Makefile:198: recipe for target 'curl-7.81.0/.firefox' failed
make: *** [curl-7.81.0/.firefox] Error 1

The related log in ./curl-7.81.0/config.log shows:

...
configure:26797: checking if libnssckbi is in a non-standard location
configure:26815: result: /usr/lib
configure:26843: checking for SSL_VersionRangeSet in -lnss_static
configure:26865: gcc -o conftest -I/home/ec2-user/apps/curl-impersonate/build/nss-3.75/dist/Release/../public/nss -I/home/ec2-user/apps/curl-impersonate/build/nss-3.75/dist/Release/include/nspr -Werror-implicit-function-declaration -O2 -Wno-system-headers    -I/home/ec2-user/apps/curl-impersonate/build/brotli-1.0.9/out/installed/include -I/home/ec2-user/apps/curl-impersonate/build/nss-3.75/dist/Release/include -L/home/ec2-user/apps/curl-impersonate/build/nss-3.75/dist/Release/lib -Wl,-rpath,/usr/lib    -L/home/ec2-user/apps/curl-impersonate/build/brotli-1.0.9/out/installed/lib conftest.c -lnss_static -Wl,--start-group -lssl -lnss_static -lpk11wrap_static -lcertdb -lcerthi -lnsspki -lnssdev -lsoftokn_static -lfreebl_static -lnssutil -lnssb -lcryptohi -l:libplc4.a -l:libplds4.a -l:libnspr4.a -lsqlite -lgcm-aes-x86_c_lib -lhw-acc-crypto-avx -lhw-acc-crypto-avx2 -lsha-x86_c_lib -lintel-gcm-wrap_c_lib -lintel-gcm-s_lib -Wl,--end-group -pthread -ldl -lbrotlidec-static -lbrotlicommon-static -lz  >&5
/usr/bin/ld: cannot find -lbrotlidec-static
/usr/bin/ld: cannot find -lbrotlicommon-static
collect2: error: ld returned 1 exit status
configure:26865: $? = 1
configure: failed program was:
| /* confdefs.h */
| #define PACKAGE_NAME "curl"
| #define PACKAGE_TARNAME "curl"
| #define PACKAGE_VERSION "-"
| #define PACKAGE_STRING "curl -"
| #define PACKAGE_BUGREPORT "a suitable curl mailing list: https://curl.se/mail/"
...

Any help is appreciated.
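As a small debugging aid, the missing libraries can be pulled out of a config.log excerpt mechanically; a sketch (pure text processing, no assumptions about the build itself):

```shell
# Extract the library names ld could not find ("cannot find -lfoo") from a
# log file, one per line, de-duplicated.
missing_libs() {
  grep -o 'cannot find -l[^ ]*' "$1" | sed 's/.*-l//' | sort -u
}

# Demo on the two ld lines from the log above:
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
/usr/bin/ld: cannot find -lbrotlidec-static
/usr/bin/ld: cannot find -lbrotlicommon-static
EOF
libs=$(missing_libs "$tmp")
echo "$libs"
rm -f "$tmp"
```

Here the output points at the brotli static archives, so a sensible next step would be to check whether build/brotli-1.0.9/out/installed/lib actually contains libbrotlidec-static.a and libbrotlicommon-static.a.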

Sometimes it fails to bypass Cloudflare

In the case of Chrome and Edge, it sometimes (but not always) fails for the particular site I was focusing on. I did not encounter any problems with Firefox and Safari, though.
This is the main sample site: https://pegaxy.io/
You can specifically focus on this address: https://api.pegaxy.io/my/info
And I used it on Docker.
You can send consecutive requests and test.
By the way, I already asked about the details of what I want to do here:
https://stackoverflow.com/q/71529199/1407491

Here is an example of a log that works:

SHOW LOG
# docker run --rm lwthiker/curl-impersonate:0.3-chrome curl_chrome99 'https://api.pegaxy.io/my/info' \
>   -H 'authority: api.pegaxy.io' \
>   -H 'sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="99", "Google Chrome";v="99"' \
>   -H 'accept: application/json' \
>   -H 'sec-ch-ua-platform: "Windows"' \
>   -H 'sec-ch-ua-mobile: ?0' \
>   -H 'user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36' \
>   -H 'sec-ch-ua-platform: "Windows"' \
>   -H 'origin: https://play.pegaxy.io' \
>   -H 'sec-fetch-site: same-site' \
>   -H 'sec-fetch-mode: cors' \
>   -H 'sec-fetch-dest: empty' \
>   -H 'referer: https://play.pegaxy.io/marketplace' \
>   -H 'accept-language: en-US,en;q=0.9,fa;q=0.8,de;q=0.7' \
>   --compressed -s -vv
*   Trying 172.67.10.157:443...
* Connected to api.pegaxy.io (172.67.10.157) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* Cipher selection: TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,ECDHE-ECDSA-AES128-GCM-SHA256,ECDHE-RSA-AES128-GCM-SHA256,ECDHE-ECDSA-AES256-GCM-SHA384,ECDHE-RSA-AES256-GCM-SHA384,ECDHE-ECDSA-CHACHA20-POLY1305,ECDHE-RSA-CHACHA20-POLY1305,ECDHE-RSA-AES128-SHA,ECDHE-RSA-AES256-SHA,AES128-GCM-SHA256,AES256-GCM-SHA384,AES128-SHA,AES256-SHA
*  CAfile: /etc/ssl/certs/ca-certificates.crt
*  CApath: none
* ALPS, offering h2
} [5 bytes data]
* TLSv1.2 (OUT), TLS handshake, Client hello (1):
} [512 bytes data]
* TLSv1.2 (IN), TLS handshake, Server hello (2):
{ [122 bytes data]
* TLSv1.3 (OUT), TLS change cipher, Change cipher spec (1):
} [1 bytes data]
* TLSv1.3 (IN), TLS handshake, Encrypted Extensions (8):
{ [19 bytes data]
* TLSv1.3 (IN), TLS handshake, Unknown (25):
{ [3156 bytes data]
* TLSv1.3 (IN), TLS handshake, CERT verify (15):
{ [80 bytes data]
* TLSv1.3 (IN), TLS handshake, Finished (20):
{ [36 bytes data]
* TLSv1.3 (OUT), TLS handshake, Finished (20):
} [36 bytes data]
* SSL connection using TLSv1.3 / TLS_CHACHA20_POLY1305_SHA256
* ALPN, server accepted to use h2
* Server certificate:
*  subject: CN=*.pegaxy.io
*  start date: Mar  3 05:22:24 2022 GMT
*  expire date: Jun  1 05:22:23 2022 GMT
*  subjectAltName: host "api.pegaxy.io" matched cert's "*.pegaxy.io"
*  issuer: C=US; O=Let's Encrypt; CN=E1
*  SSL certificate verify ok.
* Using HTTP2, server supports multiplexing
* Connection state changed (HTTP/2 confirmed)
* Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
} [5 bytes data]
* Using Stream ID: 1 (easy handle 0x7f0ad96c1a90)
} [5 bytes data]
> GET /my/info HTTP/2
> Host: api.pegaxy.io
> sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="99", "Google Chrome";v="99"
> sec-ch-ua-mobile: ?0
> sec-ch-ua-platform: "Windows"
> upgrade-insecure-requests: 1
> user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36
> accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
> sec-fetch-site: none
> sec-fetch-mode: navigate
> sec-fetch-user: ?1
> sec-fetch-dest: document
> accept-encoding: gzip, deflate, br
> accept-language: en-US,en;q=0.9
> authority: api.pegaxy.io
> sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="99", "Google Chrome";v="99"
> accept: application/json
> sec-ch-ua-mobile: ?0
> user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36
> sec-ch-ua-platform: "Windows"
> origin: https://play.pegaxy.io
> sec-fetch-site: same-site
> sec-fetch-mode: cors
> sec-fetch-dest: empty
> referer: https://play.pegaxy.io/marketplace
> accept-language: en-US,en;q=0.9,fa;q=0.8,de;q=0.7
>
{ [5 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [222 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [222 bytes data]
* old SSL session ID is stale, removing
{ [5 bytes data]
* Connection state changed (MAX_CONCURRENT_STREAMS == 256)!
} [5 bytes data]
< HTTP/2 200
< date: Mon, 21 Mar 2022 23:22:10 GMT
< content-type: application/json; charset=utf-8
< vary: Accept-Encoding
< vary: Origin
< access-control-allow-origin: https://play.pegaxy.io
< etag: W/"29-bVHj0ypH/h4ZX9esOoZbwspQiQY"
< x-frame-options: SAMEORIGIN
< x-xss-protection: 1; mode=block
< x-content-type-options: nosniff
< referrer-policy: no-referrer-when-downgrade
< content-security-policy: default-src 'self' http: https: data: blob: 'unsafe-inline'
< strict-transport-security: max-age=31536000; includeSubDomains
< cf-cache-status: DYNAMIC
< expect-ct: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
< set-cookie: __cf_bm=5alOaylVbD3I6nERRUHj3COnrAUKDaLA0KIZt9bE8ww-1647904930-0-AWqZ1lWXsTPZlGfyqlMjEtpAUJK8Qjbj8rDlMh2vHuuLEIWEH9vFrwcc/lCm4WBFkT/MMs68cv04GOiBTvbTALw=; path=/; expires=Mon, 21-Mar-22 23:52:10 GMT; domain=.pegaxy.io; HttpOnly; Secure; SameSite=None
< server: cloudflare
< cf-ray: 6efa6d987ed09b71-FRA
< content-encoding: br
<
{ [45 bytes data]
* Connection #0 to host api.pegaxy.io left intact
{"status":false,"error":"USER_NOT_FOUND"}

And here is a sample log for when it does not work:

SHOW LOG
# docker run --rm lwthiker/curl-impersonate:0.3-chrome curl_chrome99 'https://api.pegaxy.io/my/info' \
>   -H 'authority: api.pegaxy.io' \
>   -H 'sec-ch-ua-mobile: ?0' \
>   -H 'sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="99", "Google Chrome";v="99"' \
>   -H 'accept: application/json' \
>   -H 'sec-ch-ua-mobile: ?0' \
>   -H 'user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36' \
>   -H 'sec-ch-ua-platform: "Windows"' \
>   -H 'origin: https://play.pegaxy.io' \
>   -H 'sec-fetch-site: same-site' \
>   -H 'sec-fetch-mode: cors' \
>   -H 'sec-fetch-dest: empty' \
>   -H 'referer: https://play.pegaxy.io/marketplace' \
>   -H 'accept-language: en-US,en;q=0.9,fa;q=0.8,de;q=0.7' \
>   --compressed -s -vv
*   Trying 172.67.10.157:443...
* Connected to api.pegaxy.io (172.67.10.157) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* Cipher selection: TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,ECDHE-ECDSA-AES128-GCM-SHA256,ECDHE-RSA-AES128-GCM-SHA256,ECDHE-ECDSA-AES256-GCM-SHA384,ECDHE-RSA-AES256-GCM-SHA384,ECDHE-ECDSA-CHACHA20-POLY1305,ECDHE-RSA-CHACHA20-POLY1305,ECDHE-RSA-AES128-SHA,ECDHE-RSA-AES256-SHA,AES128-GCM-SHA256,AES256-GCM-SHA384,AES128-SHA,AES256-SHA
*  CAfile: /etc/ssl/certs/ca-certificates.crt
*  CApath: none
* ALPS, offering h2
} [5 bytes data]
* TLSv1.2 (OUT), TLS handshake, Client hello (1):
} [512 bytes data]
* TLSv1.2 (IN), TLS handshake, Server hello (2):
{ [122 bytes data]
* TLSv1.3 (OUT), TLS change cipher, Change cipher spec (1):
} [1 bytes data]
* TLSv1.3 (IN), TLS handshake, Encrypted Extensions (8):
{ [19 bytes data]
* TLSv1.3 (IN), TLS handshake, Unknown (25):
{ [3156 bytes data]
* TLSv1.3 (IN), TLS handshake, CERT verify (15):
{ [78 bytes data]
* TLSv1.3 (IN), TLS handshake, Finished (20):
{ [36 bytes data]
* TLSv1.3 (OUT), TLS handshake, Finished (20):
} [36 bytes data]
* SSL connection using TLSv1.3 / TLS_CHACHA20_POLY1305_SHA256
* ALPN, server accepted to use h2
* Server certificate:
*  subject: CN=*.pegaxy.io
*  start date: Mar  3 05:22:24 2022 GMT
*  expire date: Jun  1 05:22:23 2022 GMT
*  subjectAltName: host "api.pegaxy.io" matched cert's "*.pegaxy.io"
*  issuer: C=US; O=Let's Encrypt; CN=E1
*  SSL certificate verify ok.
* Using HTTP2, server supports multiplexing
* Connection state changed (HTTP/2 confirmed)
* Copying HTTP/2 data in stream buffer to connection buffer after upgrade: len=0
} [5 bytes data]
* Using Stream ID: 1 (easy handle 0x7f396fbbda90)
} [5 bytes data]
> GET /my/info HTTP/2
> Host: api.pegaxy.io
> sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="99", "Google Chrome";v="99"
> sec-ch-ua-mobile: ?0
> sec-ch-ua-platform: "Windows"
> upgrade-insecure-requests: 1
> user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36
> accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
> sec-fetch-site: none
> sec-fetch-mode: navigate
> sec-fetch-user: ?1
> sec-fetch-dest: document
> accept-encoding: gzip, deflate, br
> accept-language: en-US,en;q=0.9
> authority: api.pegaxy.io
> sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="99", "Google Chrome";v="99"
> accept: application/json
> sec-ch-ua-mobile: ?0
> user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36
> sec-ch-ua-platform: "Windows"
> origin: https://play.pegaxy.io
> sec-fetch-site: same-site
> sec-fetch-mode: cors
> sec-fetch-dest: empty
> referer: https://play.pegaxy.io/marketplace
> accept-language: en-US,en;q=0.9,fa;q=0.8,de;q=0.7
>
{ [5 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [222 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [222 bytes data]
* old SSL session ID is stale, removing
{ [5 bytes data]
* Connection state changed (MAX_CONCURRENT_STREAMS == 256)!
} [5 bytes data]
< HTTP/2 403
< date: Mon, 21 Mar 2022 17:14:41 GMT
< content-type: text/html; charset=UTF-8
< cache-control: max-age=15
< expires: Mon, 21 Mar 2022 17:14:56 GMT
< x-frame-options: SAMEORIGIN
< expect-ct: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
< set-cookie: __cf_bm=nEnD.QUz3L43TqTxUdnxUWV7R3svGpN9CQ8MU3thu88-1647882881-0-AT/Vw/Y/DoqdLAESxkrplf95mmnU269etAJ8DpG5l//9sJ3+zDd8fC5iTyhD5x7trGkAsWonR5ErB3lSN+RuLvg=; path=/; expires=Mon, 21-Mar-22 17:44:41 GMT; domain=.pegaxy.io; HttpOnly; Secure; SameSite=None
< vary: Accept-Encoding
< server: cloudflare
< cf-ray: 6ef8534949b69b3f-FRA
< content-encoding: br
<
{ [922 bytes data]
* Connection #0 to host api.pegaxy.io left intact
<!DOCTYPE html>
<!--[if lt IE 7]> <html class="no-js ie6 oldie" lang="en-US"> <![endif]-->
<!--[if IE 7]>    <html class="no-js ie7 oldie" lang="en-US"> <![endif]-->
<!--[if IE 8]>    <html class="no-js ie8 oldie" lang="en-US"> <![endif]-->
<!--[if gt IE 8]><!--> <html class="no-js" lang="en-US"> <!--<![endif]-->
<head>
<title>Attention Required! | Cloudflare</title>
<meta charset="UTF-8" />
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<meta http-equiv="X-UA-Compatible" content="IE=Edge,chrome=1" />
<meta name="robots" content="noindex, nofollow" />
<meta name="viewport" content="width=device-width,initial-scale=1" />
<link rel="stylesheet" id="cf_styles-css" href="/cdn-cgi/styles/cf.errors.css" type="text/css" media="screen,projection" />
<!--[if lt IE 9]><link rel="stylesheet" id='cf_styles-ie-css' href="/cdn-cgi/styles/cf.errors.ie.css" type="text/css" media="screen,projection" /><![endif]-->
<style type="text/css">body{margin:0;padding:0}</style>


<!--[if gte IE 10]><!-->
<script>
  if (!navigator.cookieEnabled) {
    window.addEventListener('DOMContentLoaded', function () {
      var cookieEl = document.getElementById('cookie-alert');
      cookieEl.style.display = 'block';
    })
  }
</script>
<!--<![endif]-->


</head>
<body>
  <div id="cf-wrapper">
    <div class="cf-alert cf-alert-error cf-cookie-error" id="cookie-alert" data-translate="enable_cookies">Please enable cookies.</div>
    <div id="cf-error-details" class="cf-error-details-wrapper">
      <div class="cf-wrapper cf-header cf-error-overview">
        <h1 data-translate="block_headline">Sorry, you have been blocked</h1>
        <h2 class="cf-subheadline"><span data-translate="unable_to_access">You are unable to access</span> pegaxy.io</h2>
      </div><!-- /.header -->

      <div class="cf-section cf-highlight">
        <div class="cf-wrapper">
          <div class="cf-screenshot-container cf-screenshot-full">

              <span class="cf-no-screenshot error"></span>

          </div>
        </div>
      </div><!-- /.captcha-container -->

      <div class="cf-section cf-wrapper">
        <div class="cf-columns two">
          <div class="cf-column">
            <h2 data-translate="blocked_why_headline">Why have I been blocked?</h2>

            <p data-translate="blocked_why_detail">This website is using a security service to protect itself from online attacks. The action you just performed triggered the security solution. There are several actions that could trigger this block including submitting a certain word or phrase, a SQL command or malformed data.</p>
          </div>

          <div class="cf-column">
            <h2 data-translate="blocked_resolve_headline">What can I do to resolve this?</h2>

            <p data-translate="blocked_resolve_detail">You can email the site owner to let them know you were blocked. Please include what you were doing when this page came up and the Cloudflare Ray ID found at the bottom of this page.</p>
          </div>
        </div>
      </div><!-- /.section -->

      <div class="cf-error-footer cf-wrapper w-240 lg:w-full py-10 sm:py-4 sm:px-8 mx-auto text-center sm:text-left border-solid border-0 border-t border-gray-300">
  <p class="text-13">
    <span class="cf-footer-item sm:block sm:mb-1">Cloudflare Ray ID: <strong class="font-semibold">6ef8534949b69b3f</strong></span>
    <span class="cf-footer-separator sm:hidden">&bull;</span>
    <span class="cf-footer-item sm:block sm:mb-1"><span>Your IP</span>: xx.xx.xx.xx</span>
    <span class="cf-footer-separator sm:hidden">&bull;</span>
    <span class="cf-footer-item sm:block sm:mb-1"><span>Performance &amp; security by</span> <a rel="noopener noreferrer" href="https://www.cloudflare.com/5xx-error-landing" id="brand_link" target="_blank">Cloudflare</a></span>

  </p>
</div><!-- /.error-footer -->


    </div><!-- /#cf-error-details -->
  </div><!-- /#cf-wrapper -->

  <script type="text/javascript">
  window._cf_translation = {};


</script>

</body>
</html>

Thanks for the great project.

Libcurl certificate compression & more?

I'm using curl-impersonate through libcurl and PHP. I'm setting the ciphers and the SSL version and disabling NPN, but I'm still missing extensions 27 (compress_certificate) and 17513 (extensionApplicationSettings, BoringSSL's ALPS).

I can't set the parameters "--alps" and "--cert-compression brotli" using libcurl and PHP: the curl PHP extension doesn't support the new parameter names (see https://github.com/php/php-src/blob/master/ext/curl/interface.c#L2306).
What should I do to enable those two extensions?

Edit:
Calling putenv('CURL_IMPERSONATE=chrome98'); or running CURL_IMPERSONATE=chrome98 php /path/to/script.php beforehand doesn't help; the two TLS extensions are still not enabled.

My guess is that both "alps" and "cert-compression" have not yet been added to curl-impersonate's default parameters for libcurl.

Using CURL_IMPERSONATE=chrome98 curl "http://...." does correctly add the two extensions, however.

User supplied HTTP headers are wrongly ordered when using libcurl-impersonate

When using libcurl-impersonate (either with the CURL_IMPERSONATE env var or curl_easy_impersonate()), user-supplied HTTP headers will be placed after the built-in list of HTTP headers that libcurl-impersonate uses.

If the user supplies a HTTP header with the CURLOPT_HTTPHEADER option, it will either:

  • Replace the built-in header used for impersonation, if it's the same header (e.g. User-Agent).
  • Be placed after all the built-in headers used for impersonation.

The result is that the order of HTTP headers is not fully controllable by the user.

Example:
When impersonating Chrome 101, the following headers are added by default:

sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="101", "Google Chrome";v="101"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/101.0.4951.67 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: none
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Sec-Fetch-Dest: document
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9

If the user sets, for example, the two headers X-Custom-User-Header and User-Agent (in this order), the resulting list would look like:

sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="101", "Google Chrome";v="101"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Upgrade-Insecure-Requests: 1
User-Agent: [ Custom user agent ]
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: none
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Sec-Fetch-Dest: document
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
X-Custom-User-Header: [ Custom user header ]

Thus X-Custom-User-Header is placed AFTER User-Agent even though the user requested the opposite.

This is quite tricky to solve, as it is not always clear how the built-in and the user-supplied headers should be combined. My current thinking is to let the user disable the built-in headers altogether, so that they can choose the order themselves.

Distribute binaries, including a drop-in replacement for the shared library

Thanks for publishing this.

I see the release process is fairly automated; is it possible to use CI to build binaries automatically and publish them as releases?
I don't have experience with GitHub Actions, but I do with GitLab CI.
Both containers take 3 GB, but the binaries themselves are only ~7 MB.

Another interesting build artifact would be to "bake in" the settings from the curl_ff95 script into the build itself and create a regular curl shared library. This could replace the upstream libcurl file so existing programs pick up the new settings, or be hacked around with LD_LIBRARY_PATH.
It would need to be a separate build per browser.

Now curl is built in static mode. The library itself is in lib/.libs/libcurl.a.
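Once such a shared build exists, using it would presumably look like the usual dynamic-loader trick (the path and library name below are placeholders):

```shell
# Hypothetical: make an unmodified libcurl consumer load the impersonating
# library instead of the system libcurl.
CURL_IMPERSONATE=ff95 LD_PRELOAD=/path/to/libcurl-impersonate-ff.so ./my_app
```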

Wrong host header using CURL_IMPERSONATE env var

When using libcurl and reusing the same connection: if I set the "Host:" header for the first request and then reuse the handle for a second request without a Host header, the header is still sent with the old value.

<?php
putenv('CURL_IMPERSONATE=chrome98');
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'https://headers.cf');
curl_setopt($ch, CURLINFO_HEADER_OUT, 1);
curl_setopt( $ch, CURLOPT_ENCODING, "" );
curl_setopt( $ch, CURLOPT_FOLLOWLOCATION, false );
curl_setopt( $ch, CURLOPT_ENCODING, "" );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
curl_setopt( $ch, CURLOPT_HTTPHEADER, ['Host: abc.com']);
curl_setopt( $ch, CURLOPT_AUTOREFERER, true );

curl_exec($ch);
print_r(curl_getinfo($ch));

//curl_reset($ch);
curl_setopt($ch, CURLOPT_URL, 'https://headers.cf');
curl_setopt( $ch, CURLOPT_HTTPHEADER, ['connection: Keep-Alive']); // i didn't set "host:" there



echo curl_exec($ch);
print_r(curl_getinfo($ch));

On the first request, this is what is sent:

GET / HTTP/1.1
Host: abc.com <--- notice this
sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="98", "Google Chrome";v="98"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: none
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Sec-Fetch-Dest: document
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9

And this is what is sent on the second request:

GET / HTTP/1.1
Host: abc.com <--- this is incorrect
sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="98", "Google Chrome";v="98"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: none
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Sec-Fetch-Dest: document
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
connection: Keep-Alive

If I remove the line putenv('CURL_IMPERSONATE=chrome98');, everything works fine.
First request:

GET / HTTP/1.1
Host: abc.com <-- notice this
Accept: */*
Accept-Encoding: deflate, gzip, br

Second request:

GET / HTTP/1.1
Host: headers.cf <--- this is correct this time
Accept: */*
Accept-Encoding: deflate, gzip, br
connection: Keep-Alive
