linux:plowshare [2014/05/30 04:12] – [debug plowshare commands] admin → linux:plowshare [2022/10/29 16:15] (current) – external edit 127.0.0.1
====== Plowshare ======
Plowshare is a command-line (CLI) download/upload tool for file-sharing (one-click hosting) websites, written in Bash.
===== download and install =====
<code bash>
git clone https://
make
</code>
===== plowshare examples =====
==== plowdown examples ====
  * Download a file from RapidShare:<code bash>
plowdown http://
</code>
  * Download a file from HotFile using an account (free or premium):<code bash>
plowdown -a myuser:
</code>
  * Download a file from Oron with the Antigate.com captcha service (feature added in 2012.02.01):<code bash>
plowdown --antigate=key http://
</code>
  * Download a file from Oron with the Death by Captcha service (feature added in 2012.05.04):<code bash>
plowdown --deathbycaptcha='
</code>
  * Download a file from RapidShare through a proxy. curl honours the http_proxy and https_proxy environment variables (notice that 3128 is the default Squid port):<code bash>
export http_proxy=http://
plowdown http://
</code>
  * Download a list of links (one link per line):<code bash>
cat file_with_links.txt
# This is a comment
http://
http://
plowdown file_with_links.txt
</code>
  * Download a list of links (one per line), commenting out (with #) those successfully downloaded:<code bash>
plowdown -m file_with_links.txt
</code>
  * Limit the download rate (in bytes per second). Accepted prefixes are k, K, Ki, M, m, Mi:<code bash>
plowdown --max-rate 900K http://
</code>
  * Download a password-protected link from Mediafire:<code bash>
plowdown -p somepassword http://
</code>
  * Avoid never-ending downloads: limit the number of tries (for captchas) and the wait delay for each link:<code bash>
plowdown --max-retries=20 --timeout=3600 ...
</code>
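The `-m` (mark) option above rewrites the list file itself. A minimal sketch of that idea (the file name and URLs here are made up; the real tool does this internally after each successful download):

```shell
# Mimic "plowdown -m": comment out (with '#') a link that was
# downloaded successfully, leaving the other links untouched.
printf 'http://example.com/a\nhttp://example.com/b\n' > links.txt
done_url='http://example.com/a'
marked=$(sed "s|^$done_url\$|#&|" links.txt)
printf '%s\n' "$marked"
```

Running the list again then skips every commented-out line.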
==== plowup examples ====
  * Upload a file to your RapidShare account:<code bash>
plowup --auth=myuser:
</code>
  * Upload a file to RapidShare anonymously, changing the uploaded file name:<code bash>
plowup rapidshare /
</code>
  * Upload a file to TurboBit with an account (premium or free):<code bash>
plowup -a myuser:
</code>
  * Upload a bunch of files (anonymously, to 2Shared):<code bash>
plowup 2shared /
</code>
Notice that only files will be sent; subdirectories are ignored.
  * Upload a file to megashares (anonymously) and set a description:<code bash>
plowup -d '
</code>
  * Upload a file to Zshare anonymously through a proxy:<code bash>
export http_proxy=http://
export https_proxy=http://
plowup zshare /
</code>
  * Abort a slow upload (if the rate stays below the limit for 30 seconds):<code bash>
plowup --min-rate 100k mediafire /
</code>
  * Modify remote filenames (example: foobar.rar becomes foobar-PLOW.rar):<code bash>
plowup --name='
</code>
==== plowlist examples ====
  * List links contained in a shared-folder link and download them all:<code bash>
plowlist http://
plowdown -m links.txt
</code>
  * List a sendspace.com web folder. Render results for vBulletin "BB code":<code bash>
plowlist --printf '
</code>
  * List links contained in a dummy web page. Render results as an HTML list:<code bash>
plowlist --fallback --printf '<
</code>
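The `--printf` format strings above are truncated in this page revision; assuming the usual specifiers (`%u` for the link URL, `%f` for the filename), the BB-code rendering they produce can be mimicked with a plain shell loop:

```shell
# Render "url filename" pairs as vBulletin BB code, the way a
# plowlist --printf format string such as '[url=%u]%f[/url]' would.
render_bbcode() {
    while read -r url name; do
        printf '[url=%s]%s[/url]\n' "$url" "$name"
    done
}
out=$(printf '%s\n' 'http://example.com/f1 report.pdf' | render_bbcode)
echo "$out"
```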
==== real examples for mediafire ====
=== prepare config for mediafire ===
  * Disable SSL mode
{{:
  * Share your mediafire files with everyone
{{:
  * Create a developer app config to upload files via the API
{{:
=== plowshare commands ===
  * plowlist<code bash>
plowlist http://
Retrieving list (mediafire):
# AOE2.zip
http://
# avast_free_antivirus_setup.exe
http://
# Beyond Compare 3.1.3 Build 10374 + Serial Key.rar
http://
# dropbox-2-4-6.zip
http://
# FshareSetup_4.7.0.exe
http://
# Linux System Administration.pdf
http://
# MediaFireDesktop-0.10.50.9468-windows-PRODUCTION.exe
http://
# navicat8_mysql_en.zip
http://
# OneDriveSetup.exe
http://
# Sparx Enterprise Architect v9.0.0.908.rar
http://
</code>
  * plowdown
  * plowup

===== Basic knowledge =====
==== Using basic commands (non-script) from the plowshare source ====
=== debug plowlist ===
Debug the command below:
<code bash>
bash -x /
</code>
=> Check the curl calls and the .php API calls it makes
  * command 1: <code bash>
curl --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent '
<?xml version="
... (truncated XML response: folder information and the folder link) ...
</code>
  * command 2: <code bash>
curl --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent '
<?xml version="
... (truncated XML response: folder content listing with per-file hashes and links,
e.g. hash 4d737c99c02ac5f3b81b95ec32c2d81f099c1b91f7c505f9cebad2a01f0a7598) ...
</code>
=== debug plowdown ===
Debug the command below:
<code bash>
bash -x /
</code>
=> Check the curl calls and the .php API calls it makes
  * command 1: <code bash>
curl --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent '
HTTP/1.1 301
Date: Fri, 30 May 2014 04:20:28 GMT
Content-Type:
Connection: close
Cache-control:
Expires: 0
Location: /
Pragma: no-cache
Set-Cookie: ukey=ue25ueitucwb8fbolgu8dpn869nur89o;
Server: MediaFire
Access-Control-Allow-Origin:
</code>
  * command 2: <code bash>
curl --insecure --compressed --speed-time 600 --connect-timeout 240 --user-agent '
</code>
==== Using basic commands based on the mediafire API ====
Refer: http://
=== Get Login Token ===
<code bash>
curl -k "
</code>
<code xml>
... (truncated XML response containing the login token) ...
</code>
=== Get Session Token ===
<code bash>
curl -k "
</code>
<code xml>
... (truncated XML response; the returned token value:) ...
b781767140c809f2c7fea45275161504ac29a545ee4d451d4e25f075de67c45f93b22945df3eaf005fa92436ca80a496eab36332f5d401a84a6e31e2448c02606da9c23fce60a5bb
</code>
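The token element in that reply is what every later API call needs. A hedged sketch of pulling it out of a (single-line) XML response with sed; the tag name `session_token` matches the API call above, but the XML and the token value here are stand-ins:

```shell
# Extract the <session_token> value from a mediafire-style XML reply.
xml='<response><result>Success</result><session_token>deadbeef1234</session_token></response>'
token=$(printf '%s' "$xml" |
    sed -ne 's|.*<session_token>\([^<]*\)</session_token>.*|\1|p')
echo "$token"
```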
=== Get mediafire settings ===
<code bash>
curl "
</code>
<code xml>
<?xml version="
... (truncated XML response: account settings) ...
</code>
=== Get folder information ===
<code bash>
curl "
</code>
<code xml>
<?xml version="
... (truncated XML response: folder information) ...
</code>
=== Get the list of files in a folder ===
<code bash>
curl "
</code>
<code xml>
<?xml version="
... (truncated XML response: one entry per file, with hashes such as
ed34cb6a33372502ef61aa949fc58fe643a1d8cf830e2a455ab4bcb49759ceda and
0b7c1953ef9aa4e396dfef944b392d7ae094ffb781c121b89d0e19ee01c6eb8b,
plus the file download links) ...
</code>
=== Get a direct download link based on a quickkey ===
<code bash>
curl "
</code>
<code xml>
<?xml version="
... (truncated XML response: the direct download link) ...
</code>
=== add web upload ===
<code bash>
curl "
</code>
<code xml>
<?xml version="
... (truncated XML response) ...
</code>

==== make file for install ====
===== plowshare core =====
==== post_login ====
<code bash>
post_login() {
    local -r AUTH=$1
    local -r COOKIE=$2
    local -r POSTDATA=$3
    local -r LOGIN_URL=$4
    shift 4
    local -a CURL_ARGS=("$@")
    local USER PASSWORD DATA RESULT

    if [ -z "$AUTH" ]; then
        log_error "$FUNCNAME: authentication string is empty"
        return $ERR_LOGIN_FAILED
    fi

    if [ -z "$POSTDATA" ]; then
        log_error "$FUNCNAME: postdata string is empty"
        return $ERR_LOGIN_FAILED
    fi

    # Seem faster than
    # IFS=":" read USER PASSWORD <<< "$AUTH"
    USER=$(echo "${AUTH%%:*}" | uri_encode_strict)
    PASSWORD=$(echo "${AUTH#*:}" | uri_encode_strict)

    if [ -z "$PASSWORD" -o "$AUTH" = "${AUTH#*:}" ]; then
        PASSWORD=$(prompt_for_password) || true
    fi

    log_notice "Starting login process: $USER/${PASSWORD//?/*}"

    DATA=$(eval echo "${POSTDATA//&/\\&}")
    RESULT=$(curl --cookie-jar "$COOKIE" --data "$DATA" "${CURL_ARGS[@]}" \
        "$LOGIN_URL") || return

    # "$RESULT" can be empty, this is not necessarily an error
    if [ ! -s "$COOKIE" ]; then
        log_debug "$FUNCNAME: no entry was set (empty cookie file)"
        return $ERR_LOGIN_FAILED
    fi

    log_report '=== COOKIE BEGIN ==='
    logcat_report "$COOKIE"
    log_report '=== COOKIE END ==='

    if ! find_in_array CURL_ARGS[@] '
        echo "$RESULT"
    fi
}
</code>
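The user/password split in `post_login` relies on bash parameter expansion rather than a `read`/`IFS` pipeline. In isolation, with placeholder credentials:

```shell
# Split "user:password" the way post_login does:
# ${VAR%%:*} keeps everything before the first ':',
# ${VAR#*:}  keeps everything after it.
AUTH='myuser:mypassword'
USER=${AUTH%%:*}
PASSWORD=${AUTH#*:}
echo "$USER $PASSWORD"
```

Because the expansion happens in the shell itself, no subprocess is forked, which is why the source comments that it "seems faster".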
==== parse ====
<code bash>
# Get lines that match filter+parse regular expressions and extract string from it.
#
# $1: regexp to filter (take lines matching $1 pattern; "
# $2: regexp to parse (must contain parentheses to capture text). Example: "
# $3: (optional) how many lines to skip (default is 0: filter and match regexp on same line).
#     Note: $3 may only be used if line filtering is active ($1 != "
#
# stdin: text data
# stdout: result
parse_all() {
    local PARSE=$2
    local -i N=${3:-0}
    local -r D=$'
    local STRING FILTER

    if [ -n "
        FILTER="
    else
        [ $N -eq 0 ] || return $ERR_FATAL
    fi

    [ '
    [ '
    PARSE="

    if [ $N -eq 0 ]; then
        # STRING=$(sed -ne "/$1/ s/
        STRING=$(sed -ne "

    elif [ $N -eq 1 ]; then
        # Note: Loop (with label) is required for consecutive matches
        # STRING=$(sed -ne ":a /$1/ {n;h; s/$2/\1/p; g;
        STRING=$(sed -ne ":a $FILTER {n;h; $PARSE; g;

    elif [ $N -eq -1 ]; then
        # STRING=$(sed -ne "/$1/ {x; s/$2/\1/p; b;}" -e '
        STRING=$(sed -ne "

    else
        local -r FIRST_LINE='
        local -r LAST_LINE='
        local N_ABS=$(( N < 0 ? -N : N ))
        local I=$(( N_ABS - 2 )) # Note: N_ABS >= 2 due to "elif" above
        local LINES='
        local INIT='
        local FILTER_LINE PARSE_LINE

        [ $N_ABS -gt 10 ] &&
            log_notice "

        while (( I-- )); do
            INIT+='
        done

        while (( N_ABS-- )); do
            LINES+='
        done

        if [ $N -gt 0 ]; then
            FILTER_LINE=$FIRST_LINE
            PARSE_LINE=$LAST_LINE
        else
            FILTER_LINE=$LAST_LINE
            PARSE_LINE=$FIRST_LINE
        fi

        STRING=$(sed -ne "1 {$INIT;
            -e "
            -e "
            -e "

        # Explanation:
        #     as well as the previous N lines
        # [3] let pattern space contain only the line we test filter regex
        #     on (i.e. first buffered line on skip > 0, last line on skip < 0)
        # [4] if filter regex matches, let pattern space contain the line to
        #     be parsed and apply parse command
    fi

    if [ -z "
        log_error "
        log_notice_stack
        return $ERR_FATAL
    fi

    echo "
}

# Like parse_all, but get only first match
parse() {
    local PARSE=$2
    local -i N=${3:-0}
    local -r D=$'
    local STRING FILTER

    if [ -n "
        FILTER="
    else
        [ $N -eq 0 ] || return $ERR_FATAL
    fi

    [ '
    [ '
    PARSE="

    if [ $N -eq 0 ]; then
        # Note: This requires GNU sed (which is assumed by Plowshare4)
        STRING=$(sed -ne "

    elif [ $N -eq 1 ]; then
        STRING=$(sed -ne ":a $FILTER {n;

    elif [ $N -eq -1 ]; then
        STRING=$(sed -ne "

    else
        local -r FIRST_LINE='
        local -r LAST_LINE='
        local N_ABS=$(( N < 0 ? -N : N ))
        local I=$(( N_ABS - 2 ))
        local LINES='
        local INIT='
        local FILTER_LINE PARSE_LINE

        [ $N_ABS -gt 10 ] &&
            log_notice "

        while (( I-- )); do
            INIT+='
        done

        while (( N_ABS-- )); do
            LINES+='
        done

        if [ $N -gt 0 ]; then
            FILTER_LINE=$FIRST_LINE
            PARSE_LINE=$LAST_LINE
        else
            FILTER_LINE=$LAST_LINE
            PARSE_LINE=$FIRST_LINE
        fi

        # Note: Need to "
        STRING=$(sed -ne "1 {$INIT;
            -e "
            -e "
            -e "
    fi

    if [ -z "
        log_error "
        log_notice_stack
        return $ERR_FATAL
    fi

    echo "
}
</code>
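The skip-line case (N=1) that `parse_all` builds dynamically boils down to a sed idea: match a filter line, advance one line with `n`, then apply the capture. A standalone sketch with dummy data (`parse_all` additionally adds a label/branch loop so consecutive filter matches still work):

```shell
# Filter on "item:" lines, then parse the *next* line (skip = 1):
# /item:/ selects the filter line, n advances to the line below,
# and the s///p command captures and prints the value from it.
matches=$(printf 'item:\nvalue=42\nnoise\nitem:\nvalue=7\n' |
    sed -ne '/item:/ {n; s/^value=\([0-9]*\)$/\1/p;}')
echo "$matches"
```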
==== parse_json ====
<code bash>
# Simple and limited JSON parsing
#
# Notes:
# - Single-line parsing oriented (user should strip newlines first): no tree model
# - Array and Object types: basic poor support (depth 1, without complex types)
# - String type: no support for escaped unicode characters (\uXXXX)
# - No non-standard C/C++ comments handling (like in JSONP)
# - If several entries exist on same line: last occurrence is taken, but:
#
# - If several entries exist on different lines: all are returned (it's a parse_all_json)
#
# $1: variable name (string)
# $2: (optional) preprocess option. Accepted values are:
#     - "
#     - "
# stdin: JSON data
# stdout: result
parse_json() {
    local -r NAME="
    local STRING PRE
    local -r END='

    if [ "
        PRE="
    elif [ "
        PRE=sed\ -e\ '
    else
        PRE='
    fi

    # Note: "
    STRING=$($PRE | sed \
        -ne "/
        -ne "/
        -ne "
        -ne "
        -ne "

    if [ -z "
        log_error "
        log_notice_stack
        return $ERR_FATAL
    fi

    # Translate two-character sequence escape representations
    STRING=${STRING//
    STRING=${STRING//
    STRING=${STRING//
    STRING=${STRING//
    STRING=${STRING//
    STRING=${STRING//
    STRING=${STRING//
    STRING=${STRING//

    echo "
}
</code>
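The sed expressions inside `parse_json` are truncated in this page revision, but the core technique is a capture on `"name" : "value"` within single-line JSON. A minimal stand-in for the string-value case (the real function also handles numbers, booleans, and escape sequences):

```shell
# Extract a string field from single-line JSON with sed, the same
# technique parse_json builds its expressions around.
json='{"response":{"result":"Success","quickkey":"abc123xyz"}}'
get_field() {
    printf '%s' "$json" |
        sed -ne "s/.*\"$1\"[[:space:]]*:[[:space:]]*\"\([^\"]*\)\".*/\1/p"
}
get_field result
get_field quickkey
```

As the function's notes warn, this is line-oriented: multi-line or nested JSON needs the "join"/"split" preprocessing first.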
===== plowshare for mediafire =====
==== mediafire login ====
=== mediafire web ===
== login form ==
<code html>
... (truncated markup of the mediafire login form: email and password inputs,
a "remember me" checkbox and a submit button) ...
</code>
=> post URL: /
== debug the login POST ==
  * Header<code>
(Request-Line) POST /
Host www.mediafire.com
</code>
  * Post data<code>
login_email itanhchi@yahoo.com
login_pass xxxxxxx
login_remember on
</code>
== debug the response code for login ==
Compare an OK response with a failed one:
{{:
var et= 15
=> Login OK\\
var fp='
=> username
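The `var et= 15` / `var fp='...'` pair described above is exactly what the login code scrapes out of the returned page. A sketch with a fabricated response page (the page text and account name are made up; `et == 15` meaning success comes from the comparison above):

```shell
# Check the "var et" status code embedded in mediafire's login response;
# 15 means the login succeeded, and fp carries the account name.
page="var et= 15; var fp='someuser@example.com';"
code=$(printf '%s' "$page" | sed -ne "s/.*var et= \(-\{0,1\}[0-9]\{1,\}\).*/\1/p")
name=$(printf '%s' "$page" | sed -ne "s/.*var fp='\([^']*\)'.*/\1/p")
if [ "$code" -eq 15 ]; then
    echo "login OK as $name"
else
    echo "login failed: code $code"
fi
```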
=== mediafire login code ===
<code bash>
mediafire_download() {
    local -r COOKIE_FILE=$1
    local -r BASE_URL='
    local FILE_ID URL PAGE JSON JS_VAR

    if [ -n "
        mediafire_login "
    fi
    .................
}

# Static function. Proceed with login
# $1: authentication
# $2: cookie file
# $3: base URL
mediafire_login() {
    local -r AUTH_FREE=$1
    local -r COOKIE_FILE=$2
    local -r BASE_URL=$3
    local -r ENC_BASE_URL=$(uri_encode_strict <<<
    local LOGIN_DATA PAGE CODE NAME

    # Make sure we have "
    curl -c "

    # Notes: - "
    #        - force SSLv3 to avoid problems with curl using OpenSSL/1.0.1
    LOGIN_DATA='
    PAGE=$(post_login "
        "
        -b "

    # Note: Cookies "
    CODE=$(echo "
    NAME=$(echo "

    # Check for errors
    # Note: All error codes are explained in the page returned by the server.
    if [ $CODE -ne 15 ]; then
        log_debug "
        return $ERR_LOGIN_FAILED
    fi

    log_debug "
}
</code>
==== mediafire download ====
  * download.sh<code bash>
local FUNCTION=${MODULE}_download
$FUNCTION "
</code>
  * mediafire.sh<code bash>
# Output a mediafire file download URL
# $1: cookie file
# $2: mediafire.com url
# stdout: real file download link
mediafire_download() {
    local -r COOKIE_FILE=$1
    local -r BASE_URL='
    local FILE_ID URL PAGE JSON JS_VAR

    if [ -n "
        mediafire_login "
    fi

    FILE_ID=$(mediafire_extract_id "

    if ! mediafire_is_file_id "
        log_error 'This is a folder link. Please use plowlist!'
        return $ERR_FATAL
    fi

    # Only get site headers first to capture direct download links
    URL=$(curl --head "

    case "
        # no redirect, normal download
        '')
            URL="
            ;;
        /
            URL="
            ;;
        http://*)
            log_debug '
            echo "
            return 0
            ;;
        *errno=999)
            return $ERR_LINK_NEED_PERMISSIONS
            ;;
        *errno=320|*errno=378)
            return $ERR_LINK_DEAD
            ;;
        *errno=*)
            log_error "
            return $ERR_FATAL
    esac

    PAGE=$(curl -b "

    # <h3 class="
    match '

    # handle captcha (reCaptcha or SolveMedia) if there is one
    if match '<
        local FORM_CAPTCHA PUBKEY CHALLENGE ID RESP CAPTCHA_DATA

        FORM_CAPTCHA=$(grep_form_by_name "

        if match '
            log_debug '

            local WORD
            PUBKEY='
            RESP=$(recaptcha_process $PUBKEY) || return
            { read WORD; read CHALLENGE; read ID; } <<<

            CAPTCHA_DATA="

        elif match '
            log_debug 'Solve Media CAPTCHA found'

            PUBKEY='
            RESP=$(solvemedia_captcha_process $PUBKEY) || return
            { read CHALLENGE; read ID; } <<<

            CAPTCHA_DATA="

        else
            log_error '
            return $ERR_FATAL
        fi

        log_debug "

        PAGE=$(curl --location -b "
            $CAPTCHA_DATA "

        # Your entry was incorrect, please try again!
        if match 'Your entry was incorrect'
            captcha_nack $ID
            log_error 'Wrong captcha'
            return $ERR_CAPTCHA
        fi

        captcha_ack $ID
        log_debug '
    fi

    # Check for password protected link
    if match '
        log_debug 'File is password protected'
        if [ -z "
            LINK_PASSWORD=$(prompt_for_password) || return
        fi
        PAGE=$(curl -L --post301 -b "
            -d "

        match '
    fi

    JS_VAR=$(echo "

    # extract + output download link + file name
    mediafire_get_ofuscated_link "
    if ! parse_attr '
        parse_tag '
    fi
}
</code>
==== mediafire upload ====
Refer: https://
=== upload code ===
  * [upload.sh]<code bash>
FUNCTION=${MODULE}_upload
$FUNCTION "
    "
</code>
  * [mediafire.sh]<code bash>
# Upload a file to mediafire using the official API.
# https://
# $1: cookie file (unused here)
# $2: input file (with full path)
# $3: remote filename
# stdout: mediafire.com download link
mediafire_upload() {
    local -r COOKIE_FILE=$1
    local -r FILE=$2
    local -r DEST_FILE=$3
    local -r BASE_URL='
    local SESSION_TOKEN JSON RES KEY_ID UPLOAD_KEY QUICK_KEY FOLDER_KEY

    # Sanity checks
    [ -n "

    if [ -n "
        log_error '
        return $ERR_BAD_COMMAND_LINE
    fi

    if [ -n "
        -n "
        log_error '
        return $ERR_BAD_COMMAND_LINE
    fi

    # FIXME
    if [ -z "
        log_error '
        return $ERR_BAD_COMMAND_LINE
    fi

    SESSION_TOKEN=$(mediafire_api_get_session_token "
    log_debug "

    # API bug
    if [ "
        log_error '
    fi

    if [ -n "
        FOLDER_KEY=$(mediafire_check_folder "
    fi

    # Check for duplicate name
    JSON=$(curl --get -d "
        -d '
        -d '

        "

    RES=$(parse_json result <<<"
    if [ "
        local NUM MSG
        NUM=$(parse_json_quiet error <<<"
        MSG=$(parse_json_quiet message <<<"
        log_error "
        return $ERR_FATAL
    fi

    # "
    # Note: "
    QUICK_KEY=$(parse_json_quiet '
    if [ -n "
        if [ -n "
            log_error '
            echo "
            return 0
        else
            log_debug 'a file with the same filename already exists. File will be renamed.'
        fi
    fi

    # "
    RES=$(parse_json storage_limit_exceeded <<<"
    if [ "


    fi

    # Start upload
    if match_remote_url "
        JSON=$(curl -d "
            -d "
            -d '
            --data-urlencode "
            ${FOLDER:
            "

        KEY_ID='
    else
        local FILE_SIZE
        FILE_SIZE=$(get_filesize "

        JSON=$(curl_with_log -F "
            --header "
            --header "
            "

        KEY_ID='
    fi

    # Check for errors
    RES=$(parse_json result <<<"
    if [ "
        local NUM MSG
        NUM=$(parse_json_quiet error <<<"
        MSG=$(parse_json_quiet message <<<"
        log_error "
        return $ERR_FATAL
    fi

    UPLOAD_KEY=$(parse_json "
    log_debug "
    QUICK_KEY=''

    # Wait for upload to finish
    if match_remote_url "
        [ -n "
    else
        for N in 3 3 2 2 2; do
            wait $N seconds || return

            JSON=$(curl --get -d "
                -d '
                "

            RES=$(parse_json result <<<"
            if [ "
                log_error "FIXME '
                return $ERR_FATAL
            fi

            # No more requests for this key
            RES=$(parse_json status <<<"
            if [ "
                QUICK_KEY=$(parse_json quickkey <<<"
                break
            fi
        done
    fi

    if [ -z "
        local MSG ERR
        MSG=$(parse_json_quiet description <<<"
        ERR=$(parse_json_quiet fileerror <<<"
        log_error "Bad status $RES: '
        log_debug "
        return $ERR_FATAL
    fi

    if [ -n "
        JSON=$(curl -d "
            -d "
            ${DESCRIPTION:
            ${PRIVATE_FILE:
            "

        RES=$(parse_json result <<<"
        if [ "
            log_error 'Could not set description/
        fi
    fi

    # Note: Making a file private removes its password...
    if [ -n "
        JSON=$(curl -d "
            -d "
            -d "
            "

        RES=$(parse_json result <<<"
        if [ "
            log_error 'Could not set password.'
        fi
    fi

    echo "
}
</code>
=== Analyse the code ===
  * Step 1: Get a session token using $BASE_URL/
<code bash>
SESSION_TOKEN=$(mediafire_api_get_session_token "
</code>
  * Step 2: Check for a duplicate name using $BASE_URL/
<code bash>
JSON=$(curl --get -d "
    -d '
    -d '

    "
</code>
  * Step 3: Start the upload
    * using remote upload $BASE_URL/
<code bash>
if match_remote_url "
    JSON=$(curl -d "
        -d "
        -d '
        --data-urlencode "
        ${FOLDER:
        "
</code>
    * or uploading the file with $BASE_URL/
<code bash>
else
    local FILE_SIZE
    FILE_SIZE=$(get_filesize "

    JSON=$(curl_with_log -F "
        --header "
        --header "
        "

    KEY_ID='
fi
</code>
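After the upload starts, the "Wait for upload to finish" loop in the upload code above polls the API with a short series of waits until the status reports that the key is done (status 99, per the "No more requests for this key" check). The control flow, with the network call stubbed out and the intermediate status value made up:

```shell
# Poll a stubbed "API" until it reports terminal status 99, the way
# mediafire_upload loops over its poll call; 17 here is a placeholder
# for any in-progress status.
polls=0
status=0
poll_upload_stub() {
    polls=$((polls + 1))
    if [ "$polls" -ge 3 ]; then status=99; else status=17; fi
}
for delay in 3 3 2 2 2; do
    # real code sleeps here: wait $delay seconds || return
    poll_upload_stub
    [ "$status" -eq 99 ] && break
done
echo "finished after $polls polls (status $status)"
```

If the loop exhausts its five attempts without reaching the terminal status, the real module treats the missing quickkey as a fatal error.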