Compare commits

...

51 Commits

Author SHA1 Message Date
flightlevel
8ce3b35595 Updater: Remove unused legacy files 2018-08-18 18:22:28 +10:00
flightlevel
69cb493144 Reorganise solution 2018-08-18 18:15:45 +10:00
flightlevel
e850a5315a Package update 2018-08-18 18:08:14 +10:00
flightlevel
d054bfce87 Remove references to NET452, no longer supported 2018-08-18 17:37:46 +10:00
flightlevel
7a0bafe528 Remove Engine
Dead code since upgrade to Jackett.Server
2018-08-18 17:35:00 +10:00
flightlevel
81b40df6ee Remove Jackett Owin web server
Dead code since upgrade to Jackett.Server
2018-08-18 17:31:28 +10:00
flightlevel
795c896abe Tests: Remove dependency on Jackett.dll 2018-08-18 17:27:35 +10:00
flightlevel
524a0c7885 Remove IsRunningLegacyOwin check
Dead code since upgrade to Jackett.Server
2018-08-18 17:09:19 +10:00
flightlevel
0ddaa3bef4 Remove CurlSharp
Dead code since upgrade to Jackett.Server
2018-08-18 16:46:31 +10:00
flightlevel
e180b4bfc2 Remove references to CurlSharp
Dead code since upgrade to Jackett.Server
2018-08-18 16:44:58 +10:00
flightlevel
16c9e95ee2 Remove Jackett.Console
Dead code since upgrade to Jackett.Server
2018-08-18 16:17:58 +10:00
Garfield69
e5be938c54 limetorrents: domain change to .me fix #3627 2018-08-18 11:53:04 +12:00
Jorman
5f81fa51fc Ilcorsaroblu: Update (#3629)
Changed the andmatch filter
All regexes are now case insensitive
Some minor fixes
2018-08-18 11:42:12 +12:00
Jorman
70014485a0 Ilcorsaronero: Update (#3630)
Changed the andmatch filter
All regexes are now case insensitive
All searches are now made with the site's advanced search, up to page 10
2018-08-18 11:41:18 +12:00
Jorman
6f1f3434cc Shareisland: Update (#3631)
Changed the andmatch filter
All regexes are now case insensitive
A lot of categories changed (noticed by coincidence during some regex testing)
Some minor fixes
2018-08-18 11:40:23 +12:00
Jorman
6323dc022f Girotorrent: Update (#3628)
Changed the andmatch filter
All regexes are now case insensitive
Some minor fixes
2018-08-18 11:38:02 +12:00
Lucas
522d7eeb4c Fix YGGtorrent URL (#3615) 2018-08-17 21:58:11 +10:00
koper89
6065e1c576 Add red star torrent. (#3616) 2018-08-17 21:57:51 +10:00
koper89
eff17d8fe2 Added 720pier.ru (#3620)
* Add 720pier

* Added size and date parsing.

* Added categories.
2018-08-17 21:57:04 +10:00
flightlevel
2307f6d2a5 SpeedCD: Update TVSD 2018-08-16 19:50:30 +10:00
koper89
2e95240d34 Add BTGigs polish tracker. (#3607) 2018-08-16 19:43:34 +10:00
koper89
95df5228c6 Added HQSource polish tracker support (#3606) 2018-08-16 19:43:22 +10:00
flightlevel
bc104e356c Tidy up build script 2018-08-16 19:41:42 +10:00
kaso17
ad143ce94f reverse proxy: use X-Forwarded-Host 2018-08-15 09:00:54 +02:00
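A minimal sketch of the forwarded-headers idea behind this commit (and the XForwardedProto fix further down the list), assuming a stock ASP.NET Core 2.x host rather than Jackett's actual Startup; the class name and response body here are purely illustrative.

```csharp
using Microsoft.AspNetCore;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.HttpOverrides;

public static class ForwardedHeadersDemo
{
    public static void Main(string[] args) =>
        WebHost.CreateDefaultBuilder(args)
            .Configure(app =>
            {
                // Trust the reverse proxy's headers so generated links and redirects
                // use the external host and scheme instead of Kestrel's own.
                app.UseForwardedHeaders(new ForwardedHeadersOptions
                {
                    ForwardedHeaders = ForwardedHeaders.XForwardedFor
                                     | ForwardedHeaders.XForwardedProto
                                     | ForwardedHeaders.XForwardedHost
                });
                app.Run(ctx => ctx.Response.WriteAsync($"Host seen: {ctx.Request.Host}"));
            })
            .Build()
            .Run();
}
```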
flightlevel
05520f23d1 Idope: Add legacy links 2018-08-14 20:18:50 +10:00
tvebax
1884073c21 Update idope.yml (#3592)
Domain name changed.

Ref:
$ dig idope.cc @8.8.8.8 +short
104.27.131.126
104.27.130.126

$ dig idope.se @8.8.8.8
<snip>
;; OPT PSEUDOSECTION:
; EDNS: version: 0, flags:; udp: 512
;; QUESTION SECTION:
;idope.se.			IN	A
<snip>

https://www.reddit.com/r/Piracy/comments/96ajed/whats_with_idope_these_days/e407w1z
2018-08-14 20:15:51 +10:00
flightlevel
7d759917e6 Package updates 2018-08-14 20:02:54 +10:00
flightlevel
eccf4dc22c Jackett service: remove unused references 2018-08-14 19:59:32 +10:00
flightlevel
a752a39230 Add ability to use appsettings.json
https://github.com/Jackett/Jackett/issues/3583
2018-08-14 19:58:11 +10:00
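A hedged sketch of how an appsettings.json file can be picked up via the standard Microsoft.Extensions.Configuration APIs; this is illustrative wiring rather than Jackett's exact host code, and the "Kestrel:Endpoints" key shown is only an example value.

```csharp
using System;
using System.IO;
using Microsoft.Extensions.Configuration;

public static class ConfigDemo
{
    public static void Main()
    {
        IConfiguration config = new ConfigurationBuilder()
            .SetBasePath(Directory.GetCurrentDirectory())
            // optional: the server still starts if the file is absent
            .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
            .Build();

        // e.g. {"Kestrel": {"Endpoints": {"Http": {"Url": "http://*:9117"}}}}
        Console.WriteLine(config["Kestrel:Endpoints:Http:Url"] ?? "<not set>");
    }
}
```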
hallengreenn
34dd2981b3 Update Nordicbits.cs (#3580)
Added porn to the list of categories and fixed the Blu-ray remux category ID.
2018-08-13 20:00:47 +10:00
kaso17
0adb54f4b2 kestrel: attempt to fix XForwardedProto issues 2018-08-12 12:18:14 +02:00
hallengreenn
2e77226f0c Update nordicb.org - categoryList for search (#3568)
I made some copy-paste errors while building the categoryList.
Because of this, the search string for Apps & Ebooks wasn't built correctly.
Fixed
2018-08-12 16:50:43 +10:00
flightlevel
31ae08544f Remove System.Runtime.InteropServices.RuntimeInformation.dll from Mono build
https://github.com/Jackett/Jackett/issues/3547
2018-08-12 16:49:52 +10:00
Garfield69
3da6e4ca1b torrent9: domain change. fix #3569 2018-08-12 08:06:21 +12:00
halali
c4cd17ce2d Update cztorrent.yml (#3551)
* Update cztorrent.yml

* Update cztorrent.yml
2018-08-11 12:53:35 +10:00
hallengreenn
ee18368192 Add support for Nordicbits.org/nordicb.org (#3552)
* Create Nordicbits.cs

Add support for Nordicbits / DK Private Tracker

* Create ConfigurationDataNordicbits.cs

Create Nordicbits.cs

Add support for Nordicbits / DK Private Tracker

* Update README.md

Added a new tracker Nordicbits

* Update Nordicbits.cs

The self-tester didn't work because the tracker uses 'Today' and 'Yesterday' in releases, where other releases use "MMM dd yyyy".
This is fixed now.

* Update Nordicbits.cs

The search string wasn't built properly, due to how Nordicbits handles categories.
The images we use to determine which category a release belongs to use a parameter like this -
cat=5
But when we need to search under that category, and have more than one, it should go like this -
cats2[]=5

Changed the categories list by using a replace, and changed the query string to match the original search string (a short sketch of the resulting query string follows this commit entry).

* Update Nordicbits.cs

Should solve the bug "The string was not recognized as a valid DateTime. There is an unknown word starting at index 0."

* Update Nordicbits.cs

The tracker doesn't support IMDb, which caused problems with CouchPotato, Radarr and similar.
Fixed by setting imdb to false.
2018-08-11 12:53:06 +10:00
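To make the cat=5 versus cats2[]=5 distinction described above concrete, here is a hedged, hypothetical helper (not the indexer's actual code) that builds the multi-category form of the query string:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class NordicbitsQueryDemo
{
    // Category links on the site look like "cat=5", but a search that covers
    // several categories needs the "cats2[]=" form instead.
    public static string BuildCategoryQuery(IEnumerable<int> trackerCategories) =>
        string.Join("&", trackerCategories.Select(c => $"cats2[]={c}"));

    public static void Main() =>
        Console.WriteLine(BuildCategoryQuery(new[] { 5, 7 }));   // cats2[]=5&cats2[]=7
}
```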
kaso17
09a7950c1d x264: add Login Type note 2018-08-06 13:14:44 +02:00
Chris Johnson
847688bae8 Updated Readme to include linux Ansible installations (#3516)
Added references to RHEL and Ubuntu Ansible Galaxy roles for installing via Ansible.
2018-08-05 18:48:18 +02:00
Jorman
7fde427731 girotorrent: Update regex for better title parsing... (#3529)
Italian releases are a mess: you can find S01E01, 01x01 or even S01 E01.
So S and E were removed from the search (this brings in extra results), and the results are then parsed and reconstructed.
Hopefully this is enough for better searches, at least on Italian trackers; for external trackers (which also carry Italian releases), only god can tell... Sonarr will parse the results during an RSS sync, but during a search I think no results will show up... There are 2 possible solutions:
1. Find and eliminate all releases that don't respect the standard
2. Make Jackett / Sonarr / Radarr a little smarter and include a kind of regional substitution during the search phase
2018-08-05 18:45:42 +02:00
Jorman
7319078a5d shareisland: Update regex for better title parsing... (#3530)
Italian releases are a mess: you can find S01E01, 01x01 or even S01 E01.
So S and E were removed from the search (this brings in extra results), and the results are then parsed and reconstructed.
Hopefully this is enough for better searches, at least on Italian trackers; for external trackers (which also carry Italian releases), only god can tell... Sonarr will parse the results during an RSS sync, but during a search I think no results will show up... There are 2 possible solutions:
1. Find and eliminate all releases that don't respect the standard
2. Make Jackett / Sonarr / Radarr a little smarter and include a kind of regional substitution during the search phase
2018-08-05 18:44:55 +02:00
Jorman
495afb91e9 ilcorsaroblu: Update regex for better title parsing... (#3532)
Italian releases are a mess: you can find S01E01, 01x01 or even S01 E01.
So S and E were removed from the search (this brings in extra results), and the results are then parsed and reconstructed.
Hopefully this is enough for better searches, at least on Italian trackers; for external trackers (which also carry Italian releases), only god can tell... Sonarr will parse the results during an RSS sync, but during a search I think no results will show up... There are 2 possible solutions:
1. Find and eliminate all releases that don't respect the standard
2. Make Jackett / Sonarr / Radarr a little smarter and include a kind of regional substitution during the search phase
2018-08-05 18:42:17 +02:00
Jorman
aa3fa8717f ilcorsaronero: Update regex for better title parsing... (#3531)
Italian releases are a mess: you can find S01E01, 01x01 or even S01 E01.
So S and E were removed from the search (this brings in extra results), and the results are then parsed and reconstructed.
Hopefully this is enough for better searches, at least on Italian trackers; for external trackers (which also carry Italian releases), only god can tell... Sonarr will parse the results during an RSS sync, but during a search I think no results will show up... There are 2 possible solutions:
1. Find and eliminate all releases that don't respect the standard
2. Make Jackett / Sonarr / Radarr a little smarter and include a kind of regional substitution during the search phase
2018-08-05 18:41:20 +02:00
flightlevel
052e382d93 Assign webroot for static files
Removes the need for PhysicalFileProvider
2018-08-05 15:31:54 +10:00
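A minimal sketch of the idea behind this commit, assuming stock ASP.NET Core hosting rather than Jackett's actual Startup; the "Content" folder name is illustrative. Once the webroot points at the folder holding the UI assets, UseStaticFiles() serves them without an explicit PhysicalFileProvider.

```csharp
using System.IO;
using Microsoft.AspNetCore;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;

public static class WebRootDemo
{
    public static void Main(string[] args) =>
        WebHost.CreateDefaultBuilder(args)
            // "Content" stands in for whatever folder holds the static UI files
            .UseWebRoot(Path.Combine(Directory.GetCurrentDirectory(), "Content"))
            .Configure(app => app.UseStaticFiles())
            .Build()
            .Run();
}
```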
flightlevel
db39b6afd9 Prevent the cookie expiring after 20 minutes; it is now a session cookie 2018-08-05 15:19:31 +10:00
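A hedged sketch of cookie authentication configured so the auth cookie lives for the browser session; this is an assumed approach, not necessarily the exact change made in this commit.

```csharp
using Microsoft.AspNetCore.Authentication.Cookies;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddAuthentication(CookieAuthenticationDefaults.AuthenticationScheme)
            .AddCookie(options =>
            {
                // No fixed Expires attribute on the cookie itself, so the browser keeps
                // it for the whole session instead of dropping the login after ~20 minutes;
                // signing in with IsPersistent = false has the same effect.
                options.Cookie.Expiration = null;
                options.SlidingExpiration = true;
            });
    }

    public void Configure(IApplicationBuilder app) => app.UseAuthentication();
}
```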
HDVinnie
e05efaeb1d Update blutopia.yml (#3528)
- closes #3523
2018-08-05 15:15:54 +10:00
kaso17
b94501f054 adjust content root path (#3527) 2018-08-05 15:15:23 +10:00
flightlevel
f00d8e192a Set the content root to the application folder
https://github.com/Jackett/Jackett/issues/3522
2018-08-04 16:49:26 +10:00
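A hedged sketch (an assumed approach, not necessarily Jackett's exact code) of deriving the application folder and handing it to the host as the content root, so relative paths resolve next to the executable regardless of the working directory.

```csharp
using System;
using System.IO;
using System.Reflection;

public static class ContentRootDemo
{
    public static string ApplicationFolder() =>
        Path.GetDirectoryName(Assembly.GetEntryAssembly().Location);

    public static void Main() =>
        Console.WriteLine($"Content root would be: {ApplicationFolder()}");
        // a host would then call .UseContentRoot(ApplicationFolder()) before Build()
}
```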
eric@skrobs
9ca4600eab yggtorrent: fix URL (#3515)
Fix YGGTorrent url
2018-08-03 06:15:08 +02:00
flightlevel
5e8ebd8579 Build script: Use Kestrel web server on Mono 2018-07-31 20:19:25 +10:00
flightlevel
748881ef70 Kestrel: accept imdbid+q
Copy of 0d6830b0aa
2018-07-31 20:14:02 +10:00
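A hedged sketch of a query handler that accepts both the imdbid and q Torznab parameters on the same request; the types and names below are illustrative, not Jackett's actual classes.

```csharp
using System;

public class IncomingQuery
{
    public string SearchTerm { get; set; }
    public string ImdbId { get; set; }
}

public static class QueryDemo
{
    public static IncomingQuery Parse(string q, string imdbid) => new IncomingQuery
    {
        // Keep both values when both are supplied, so an indexer can use
        // whichever of the two it actually supports.
        SearchTerm = string.IsNullOrWhiteSpace(q) ? string.Empty : q.Trim(),
        ImdbId = string.IsNullOrWhiteSpace(imdbid) ? null : imdbid.Trim()
    };

    public static void Main()
    {
        var query = Parse("The Expanse S03", "tt3230854");
        Console.WriteLine($"q='{query.SearchTerm}' imdbid='{query.ImdbId}'");
    }
}
```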
kaso17
42e6600c6a NextTorrent: removed (dead) 2018-07-31 11:00:34 +02:00
121 changed files with 1907 additions and 12922 deletions

View File

@@ -43,7 +43,6 @@ Developer note: The software implements the [Torznab](https://github.com/Sonarr/
* LimeTorrents
* MagnetDL
* MejorTorrent <!-- maintained by ivandelabeldad -->
* NextTorrent
* Newpct (aka: tvsinpagar, descargas2020, torrentlocura, torrentrapid, etc)
* Nyaa.si
* Nyaa-Pantsu
@@ -91,6 +90,7 @@ Developer note: The software implements the [Torznab](https://github.com/Sonarr/
### Supported Private Trackers
* 2 Fast 4 You
* 3D Torrents
* 720pier
* Abnormal
* Acid-Lounge
* AlphaRatio
@@ -126,6 +126,7 @@ Developer note: The software implements the [Torznab](https://github.com/Sonarr/
* Brasil Tracker
* BroadcastTheNet
* BrokenStones
* BTGigs
* BTNext
* BTXpress
* Carpathians
@@ -184,6 +185,7 @@ Developer note: The software implements the [Torznab](https://github.com/Sonarr/
* HDTorrents.it
* Hebits
* Hon3y HD
* HQSource
* Hyperay
* ICE Torrent
* I Love Classics
@@ -208,6 +210,7 @@ Developer note: The software implements the [Torznab](https://github.com/Sonarr/
* NCore
* Nebulance
* New Real World
* NordicBits
* Norbits <!-- added by DiseaseNO but no longer maintained? -->
* notwhat.cd
* Ourbits
@@ -224,6 +227,7 @@ Developer note: The software implements the [Torznab](https://github.com/Sonarr/
* PuntoTorrent
* Racing4Everyone (R4E)
* Redacted (PassTheHeadphones)
* Red Star Torrent
* RevolutionTT
* RGU
* RoDVD
@@ -336,6 +340,12 @@ If you want to run it with a user without a /home directory you need to add `Env
Mono must be compiled with the Roslyn compiler (default), using MCS will cause "An error has occurred." errors (See https://github.com/Jackett/Jackett/issues/2704).
### Installation on Linux via Ansible
On a RHEL/Centos 7 system: [linuxhq.jackett](https://galaxy.ansible.com/linuxhq/jackett)
On an Ubuntu 16 system: [chrisjohnson00.jackett](https://galaxy.ansible.com/chrisjohnson00/jackett)
## Installation on macOS
### Prerequisites

View File

@@ -1,4 +1,4 @@
version: 0.9.{build}
version: 0.10.{build}
skip_tags: true
image: Visual Studio 2017
configuration: Release

View File

@@ -17,9 +17,6 @@ var workingDir = MakeAbsolute(Directory("./"));
var artifactsDirName = "Artifacts";
var testResultsDirName = "TestResults";
var windowsBuildFullFramework = "./BuildOutput/FullFramework/Windows";
var monoBuildFullFramework = "./BuildOutput/FullFramework/Mono";
//////////////////////////////////////////////////////////////////////
// TASKS
//////////////////////////////////////////////////////////////////////
@@ -45,17 +42,12 @@ Task("Clean")
Information("Clean completed");
});
Task("Restore-NuGet-Packages")
Task("Build-Full-Framework")
.IsDependentOn("Clean")
.Does(() =>
{
NuGetRestore("./src/Jackett.sln");
});
Task("Build")
.IsDependentOn("Restore-NuGet-Packages")
.Does(() =>
{
var buildSettings = new MSBuildSettings()
.SetConfiguration(configuration)
.UseToolVersion(MSBuildToolVersion.VS2017);
@@ -64,7 +56,7 @@ Task("Build")
});
Task("Run-Unit-Tests")
.IsDependentOn("Build")
.IsDependentOn("Build-Full-Framework")
.Does(() =>
{
CreateDirectory("./" + testResultsDirName);
@@ -81,29 +73,8 @@ Task("Run-Unit-Tests")
}
});
Task("Copy-Files-Full-Framework")
.IsDependentOn("Run-Unit-Tests")
.Does(() =>
{
var windowsOutput = windowsBuildFullFramework + "/Jackett";
CopyDirectory("./src/Jackett.Console/bin/" + configuration, windowsOutput);
CopyFiles("./src/Jackett.Updater/bin/" + configuration + "/net452" + "/JackettUpdater.*", windowsOutput); //builds against multiple frameworks
CopyFiles("./Upstart.config", windowsOutput);
CopyFiles("./LICENSE", windowsOutput);
CopyFiles("./README.md", windowsOutput);
var monoOutput = monoBuildFullFramework + "/Jackett";
CopyDirectory(windowsBuildFullFramework, monoBuildFullFramework);
DeleteFiles(monoOutput + "/JackettService.*");
DeleteFiles(monoOutput + "/JackettTray.*");
Information("Full framework file copy completed");
});
Task("Check-Packaging-Platform")
.IsDependentOn("Copy-Files-Full-Framework")
.IsDependentOn("Run-Unit-Tests")
.Does(() =>
{
if (IsRunningOnWindows())
@@ -117,40 +88,9 @@ Task("Check-Packaging-Platform")
}
});
Task("Package-Files-Full-Framework-Mono")
Task("Package-Windows-Full-Framework")
.IsDependentOn("Check-Packaging-Platform")
.Does(() =>
{
Gzip(monoBuildFullFramework, $"./{artifactsDirName}", "Jackett", "Jackett.Binaries.Mono.tar.gz");
Information(@"Full Framework Mono Binaries Gzip Completed");
});
Task("Package-Full-Framework")
.IsDependentOn("Package-Files-Full-Framework-Mono")
.Does(() =>
{
Information("Full Framework Packaging Completed");
});
Task("Kestrel-Full-Framework")
.IsDependentOn("Package-Full-Framework")
.Does(() =>
{
CleanDirectories("./src/**/obj");
CleanDirectories("./src/**/bin");
NuGetRestore("./src/Jackett.sln");
var buildSettings = new MSBuildSettings()
.SetConfiguration(configuration)
.UseToolVersion(MSBuildToolVersion.VS2017);
MSBuild("./src/Jackett.sln", buildSettings);
});
Task("Experimental-Kestrel-Windows-Full-Framework")
.IsDependentOn("Kestrel-Full-Framework")
.Does(() =>
{
string serverProjectPath = "./src/Jackett.Server/Jackett.Server.csproj";
string buildOutputPath = "./BuildOutput/Experimental/net461/win7-x86/Jackett";
@@ -178,8 +118,8 @@ Task("Experimental-Kestrel-Windows-Full-Framework")
InnoSetup("./Installer.iss", settings);
});
Task("Experimental-Kestrel-Mono-Full-Framework")
.IsDependentOn("Kestrel-Full-Framework")
Task("Package-Mono-Full-Framework")
.IsDependentOn("Check-Packaging-Platform")
.Does(() =>
{
string serverProjectPath = "./src/Jackett.Server/Jackett.Server.csproj";
@@ -199,11 +139,17 @@ Task("Experimental-Kestrel-Mono-Full-Framework")
var configFile = File(buildOutputPath + "/JackettConsole.exe.config");
XmlPoke(configFile, "configuration/runtime/*[name()='assemblyBinding']/*[name()='dependentAssembly']/*[name()='assemblyIdentity'][@name='System.Net.Http']/../*[name()='bindingRedirect']/@newVersion", "4.0.0.0");
Gzip("./BuildOutput/Experimental/net461/linux-x64", $"./{artifactsDirName}", "Jackett", "Experimental.Jackett.Binaries.Mono.tar.gz");
//Mono on FreeBSD doesn't like the bundled System.Runtime.InteropServices.RuntimeInformation
//https://github.com/dotnet/corefx/issues/23989
//https://github.com/Jackett/Jackett/issues/3547
DeleteFile(buildOutputPath + "/System.Runtime.InteropServices.RuntimeInformation.dll");
Gzip("./BuildOutput/Experimental/net461/linux-x64", $"./{artifactsDirName}", "Jackett", "Jackett.Binaries.Mono.tar.gz");
});
Task("Experimental-DotNetCore")
.IsDependentOn("Kestrel-Full-Framework")
.IsDependentOn("Check-Packaging-Platform")
.Does(() =>
{
string serverProjectPath = "./src/Jackett.Server/Jackett.Server.csproj";
@@ -217,18 +163,17 @@ Task("Experimental-DotNetCore")
Gzip("./BuildOutput/Experimental/netcoreapp2.1/linux-x64", $"./{artifactsDirName}", "Jackett", "Experimental.netcoreapp.linux-x64.tar.gz");
});
Task("Experimental")
.IsDependentOn("Experimental-Kestrel-Windows-Full-Framework")
.IsDependentOn("Experimental-Kestrel-Mono-Full-Framework")
Task("Package")
.IsDependentOn("Package-Windows-Full-Framework")
.IsDependentOn("Package-Mono-Full-Framework")
//.IsDependentOn("Experimental-DotNetCore")
.Does(() =>
{
Information("Experimental builds completed");
Information("Packaging completed");
});
Task("Appveyor-Push-Artifacts")
.IsDependentOn("Package-Full-Framework")
.IsDependentOn("Experimental")
.IsDependentOn("Package")
.Does(() =>
{
if (AppVeyor.IsRunningOnAppVeyor)

View File

@@ -1,244 +0,0 @@
using System;
using System.Runtime.InteropServices;
using CurlSharp.Enums;
namespace CurlSharp.Callbacks
{
/// <summary>
/// Called when cURL has debug information for the client.
/// </summary>
/// <remarks>
/// For usage, see the sample <c>Upload.cs</c>.
/// Arguments passed to the recipient include:
/// <list type="table">
/// <listheader>
/// <term>Argument</term>
/// <description>Description</description>
/// </listheader>
/// <item>
/// <term>infoType</term>
/// <description>
/// Type of debug information, see
/// <see cref="CurlInfoType" />.
/// </description>
/// </item>
/// <item>
/// <term>message</term>
/// <description>Debug information as a string.</description>
/// </item>
/// <item>
/// <term>size</term>
/// <description>The size in bytes.</description>
/// </item>
/// <item>
/// <term>extraData</term>
/// <description>Client-provided extra data.</description>
/// </item>
/// </list>
/// </remarks>
public delegate void CurlDebugCallback(CurlInfoType infoType, String message, int size, Object extraData);
/// <summary>
/// Called when cURL has header data for the client.
/// </summary>
/// <remarks>
/// For usage, see the sample <c>Headers.cs</c>.
/// Arguments passed to the recipient include:
/// <list type="table">
/// <listheader>
/// <term>Argument</term>
/// <description>Description</description>
/// </listheader>
/// <item>
/// <term>buf</term>
/// <description>Header data from cURL to the client.</description>
/// </item>
/// <item>
/// <term>size</term>
/// <description>Size of a character, in bytes.</description>
/// </item>
/// <item>
/// <term>nmemb</term>
/// <description>Number of characters.</description>
/// </item>
/// <item>
/// <term>extraData</term>
/// <description>Client-provided extra data.</description>
/// </item>
/// </list>
/// Your implementation should return the number of bytes (not
/// characters) processed. Usually this is <c>size * nmemb</c>.
/// Return -1 to abort the transfer.
/// </remarks>
public delegate int CurlHeaderCallback(byte[] buf, int size, int nmemb, Object extraData);
/// <summary>
/// Called when cURL needs for the client to perform an
/// IOCTL operation. An example might be when an FTP
/// upload requires rewinding of the input file to deal
/// with a resend occasioned by an error.
/// </summary>
/// <remarks>
/// <list type="table">
/// <listheader>
/// <term>Argument</term>
/// <description>Description</description>
/// </listheader>
/// <item>
/// <term>cmd</term>
/// <description>
/// A <see cref="CurlIoCommand" />; for now, only
/// <c>RestartRead</c> should be passed.
/// </description>
/// </item>
/// <item>
/// <term>extraData</term>
/// <description>
/// Client-provided extra data; in the
/// case of an FTP upload, it might be a
/// <c>FileStream</c> object.
/// </description>
/// </item>
/// </list>
/// Your implementation should return a <see cref="CurlIoError" />,
/// which should be <see cref="CurlIoError.Ok" /> if everything
/// is okay.
/// </remarks>
public delegate CurlIoError CurlIoctlCallback(CurlIoCommand cmd, Object extraData);
/// <summary>
/// Called when cURL wants to report progress.
/// </summary>
/// <remarks>
/// For usage, see the sample <c>Upload.cs</c>.
/// Arguments passed to the recipient include:
/// <list type="table">
/// <listheader>
/// <term>Argument</term>
/// <description>Description</description>
/// </listheader>
/// <item>
/// <term>extraData</term>
/// <description>Client-provided extra data.</description>
/// </item>
/// <item>
/// <term>dlTotal</term>
/// <description>Number of bytes to download.</description>
/// </item>
/// <item>
/// <term>dlNow</term>
/// <description>Number of bytes downloaded so far.</description>
/// </item>
/// <item>
/// <term>ulTotal</term>
/// <description>Number of bytes to upload.</description>
/// </item>
/// <item>
/// <term>ulNow</term>
/// <description>Number of bytes uploaded so far.</description>
/// </item>
/// </list>
/// Your implementation should return 0 to continue, or a non-zero
/// value to abort the transfer.
/// </remarks>
public delegate int CurlProgressCallback(Object extraData, double dlTotal, double dlNow,
double ulTotal, double ulNow);
/// <summary>
/// Called when cURL wants to read data from the client.
/// </summary>
/// <remarks>
/// For usage, see the sample <c>Upload.cs</c>.
/// Arguments passed to the recipient include:
/// <list type="table">
/// <listheader>
/// <term>Argument</term>
/// <description>Description</description>
/// </listheader>
/// <item>
/// <term>buf</term>
/// <description>
/// Buffer into which your client should write data
/// for cURL.
/// </description>
/// </item>
/// <item>
/// <term>size</term>
/// <description>Size of a character, usually 1.</description>
/// </item>
/// <item>
/// <term>nmemb</term>
/// <description>Number of characters.</description>
/// </item>
/// <item>
/// <term>extraData</term>
/// <description>Client-provided extra data.</description>
/// </item>
/// </list>
/// Your implementation should return the number of bytes (not
/// characters) written to <c>buf</c>. Return 0 to abort the transfer.
/// </remarks>
public delegate int CurlReadCallback([Out] byte[] buf, int size, int nmemb, Object extraData);
/// <summary>
/// Called when cURL wants to report an Ssl event.
/// </summary>
/// <remarks>
/// For usage, see the sample <c>SSLGet.cs</c>.
/// Arguments passed to the recipient include:
/// <list type="table">
/// <listheader>
/// <term>Argument</term>
/// <description>Description</description>
/// </listheader>
/// <item>
/// <term>ctx</term>
/// <description>
/// An <see cref="CurlSslContext" /> object that wraps an
/// OpenSSL <c>SSL_CTX</c> pointer.
/// </description>
/// </item>
/// <item>
/// <term>extraData</term>
/// <description>Client-provided extra data.</description>
/// </item>
/// </list>
/// Your implementation should return a <see cref="CurlCode" />,
/// which should be <see cref="CurlCode.Ok" /> if everything
/// is okay.
/// </remarks>
public delegate CurlCode CurlSslContextCallback(CurlSslContext ctx, Object extraData);
/// <summary>
/// Called when cURL has data for the client.
/// </summary>
/// <remarks>
/// For usage, see the example <c>EasyGet.cs</c>.
/// Arguments passed to the delegate implementation include:
/// <list type="table">
/// <listheader>
/// <term>Argument</term>
/// <description>Description</description>
/// </listheader>
/// <item>
/// <term>buf</term>
/// <description>Data cURL is providing to the client.</description>
/// </item>
/// <item>
/// <term>size</term>
/// <description>Size of a character, usually 1.</description>
/// </item>
/// <item>
/// <term>nmemb</term>
/// <description>Number of characters.</description>
/// </item>
/// <item>
/// <term>extraData</term>
/// <description>Client-provided extra data.</description>
/// </item>
/// </list>
/// Your implementation should return the number of bytes (not
/// characters) processed. Return 0 to abort the transfer.
/// </remarks>
public delegate int CurlWriteCallback(byte[] buf, int size, int nmemb, Object extraData);
}

View File

@@ -1,76 +0,0 @@
using System;
using CurlSharp.Enums;
namespace CurlSharp.Callbacks
{
/// <summary>
/// Called when <c>cURL</c> wants to lock a shared resource.
/// </summary>
/// <remarks>
/// For a usage example, refer to the <c>ShareDemo.cs</c> sample.
/// Arguments passed to your delegate implementation include:
/// <list type="table">
/// <listheader>
/// <term>Argument</term>
/// <term>Description</term>
/// </listheader>
/// <item>
/// <term>data</term>
/// <term>
/// Type of data to lock; one of the values in the
/// <see cref="CurlLockData" /> enumeration.
/// </term>
/// </item>
/// <item>
/// <term>access</term>
/// <term>
/// Lock access requested; one of the values in the
/// <see cref="CurlLockAccess" /> enumeration.
/// </term>
/// </item>
/// <item>
/// <term>userData</term>
/// <term>
/// Client-provided data that is not touched internally by
/// <c>cURL</c>. This is set via
/// <see cref="CurlShareOption.UserData" /> when calling the
/// <see cref="CurlShare.SetOpt" /> member of the <see cref="CurlShare" />
/// class.
/// </term>
/// </item>
/// </list>
/// </remarks>
public delegate void CurlShareLockCallback(CurlLockData data, CurlLockAccess access, Object userData);
/// <summary>
/// Called when <c>cURL</c> wants to unlock a shared resource.
/// </summary>
/// <remarks>
/// For a usage example, refer to the <c>ShareDemo.cs</c> sample.
/// Arguments passed to your delegate implementation include:
/// <list type="table">
/// <listheader>
/// <term>Argument</term>
/// <term>Description</term>
/// </listheader>
/// <item>
/// <term>data</term>
/// <term>
/// Type of data to unlock; one of the values in the
/// <see cref="CurlLockData" /> enumeration.
/// </term>
/// </item>
/// <item>
/// <term>userData</term>
/// <term>
/// Client-provided data that is not touched internally by
/// <c>cURL</c>. This is set via
/// <see cref="CurlShareOption.UserData" /> when calling the
/// <see cref="CurlShare.SetOpt" /> member of the <see cref="CurlShare" />
/// class.
/// </term>
/// </item>
/// </list>
/// </remarks>
public delegate void CurlShareUnlockCallback(CurlLockData data, Object userData);
}

View File

@@ -1,129 +0,0 @@
/***************************************************************************
*
* CurlS#arp
*
* Copyright (c) 2013-2017 Dr. Masroor Ehsan (masroore@gmail.com)
* Portions copyright (c) 2004, 2005 Jeff Phillips (jeff@jeffp.net)
*
* This software is licensed as described in the file LICENSE, which you
* should have received as part of this distribution.
*
* You may opt to use, copy, modify, merge, publish, distribute and/or sell
* copies of this Software, and permit persons to whom the Software is
* furnished to do so, under the terms of the LICENSE file.
*
* This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF
* ANY KIND, either express or implied.
*
**************************************************************************/
using System;
using System.Runtime.InteropServices;
using CurlSharp.Enums;
namespace CurlSharp
{
/// <summary>
/// Top-level class for initialization and cleanup.
/// </summary>
/// <remarks>
/// It also implements static methods for capabilities that don't
/// logically belong in a class.
/// </remarks>
public static class Curl
{
// for state management
private static CurlCode _initCode;
/// <summary>
/// Class constructor - initialize global status.
/// </summary>
static Curl()
{
_initCode = CurlCode.FailedInit;
}
// hidden instance stuff
/// <summary>
/// Get the underlying cURL version as a string, example "7.12.2".
/// </summary>
/// <exception cref="System.InvalidOperationException">
/// Thrown if cURL isn't properly initialized.
/// </exception>
public static string Version
{
get
{
EnsureCurl();
return Marshal.PtrToStringAnsi(NativeMethods.curl_version());
}
}
/// <summary>
/// Process-wide initialization -- call only once per process.
/// </summary>
/// <param name="flags">
/// An or'd combination of
/// <see cref="CurlInitFlag" /> members.
/// </param>
/// <returns>
/// A <see cref="CurlCode" />, hopefully
/// <c>CurlCode.Ok</c>.
/// </returns>
public static CurlCode GlobalInit(CurlInitFlag flags)
{
_initCode = NativeMethods.curl_global_init((int) flags);
#if USE_LIBCURLSHIM
if (_initCode == CurlCode.Ok)
NativeMethods.curl_shim_initialize();
#endif
return _initCode;
}
/// <summary>
/// Process-wide cleanup -- call just before exiting process.
/// </summary>
/// <remarks>
/// While it's not necessary that your program call this method
/// before exiting, doing so will prevent leaks of native cURL resources.
/// </remarks>
public static void GlobalCleanup()
{
if (_initCode == CurlCode.Ok)
{
#if USE_LIBCURLSHIM
NativeMethods.curl_shim_cleanup();
#endif
NativeMethods.curl_global_cleanup();
_initCode = CurlCode.FailedInit;
}
}
/// <summary>
/// Get a <see cref="CurlVersionInfoData" /> object.
/// </summary>
/// <param name="ver">
/// Specify a <see cref="CurlVersion" />, such as
/// <c>CurlVersion.Now</c>.
/// </param>
/// <returns>A <see cref="CurlVersionInfoData" /> object.</returns>
/// <exception cref="System.InvalidOperationException">
/// Thrown if cURL isn't properly initialized.
/// </exception>
public static CurlVersionInfoData GetVersionInfo(CurlVersion ver)
{
EnsureCurl();
return new CurlVersionInfoData(ver);
}
/// <summary>
/// Called by other classes to ensure valid cURL state.
/// </summary>
internal static void EnsureCurl()
{
if (_initCode != CurlCode.Ok)
throw new InvalidOperationException("cURL not initialized");
}
}
}

File diff suppressed because it is too large.

View File

@@ -1,386 +0,0 @@
/***************************************************************************
*
* CurlS#arp
*
* Copyright (c) 2013-2017 Dr. Masroor Ehsan (masroore@gmail.com)
* Portions copyright (c) 2004, 2005 Jeff Phillips (jeff@jeffp.net)
*
* This software is licensed as described in the file LICENSE, which you
* should have received as part of this distribution.
*
* You may opt to use, copy, modify, merge, publish, distribute and/or sell
* copies of this Software, and permit persons to whom the Software is
* furnished to do so, under the terms of the LICENSE file.
*
* This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF
* ANY KIND, either express or implied.
*
**************************************************************************/
using System;
using System.Runtime.InteropServices;
using CurlSharp.Enums;
namespace CurlSharp
{
/// <summary>
/// This trivial class wraps the internal <c>curl_forms</c> struct.
/// </summary>
public sealed class CurlForms
{
/// <summary>The <see cref="CurlFormOption" />.</summary>
public CurlFormOption Option;
/// <summary>Value for the option.</summary>
public object Value;
}
/// <summary>
/// Wraps a section of multipart form data to be submitted via the
/// <see cref="CurlOption.HttpPost" /> option in the
/// <see cref="CurlEasy.SetOpt" /> member of the <see cref="CurlEasy" /> class.
/// </summary>
public class CurlHttpMultiPartForm : IDisposable
{
// the two curlform pointers
private readonly IntPtr[] _pItems;
/// <summary>
/// Constructor
/// </summary>
/// <exception cref="System.InvalidOperationException">
/// This is thrown
/// if <see cref="Curl" /> hasn't bee properly initialized.
/// </exception>
public CurlHttpMultiPartForm()
{
Curl.EnsureCurl();
_pItems = new IntPtr[2];
_pItems[0] = IntPtr.Zero;
_pItems[1] = IntPtr.Zero;
}
/// <summary>
/// Free unmanaged resources.
/// </summary>
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
/// <summary>
/// Destructor
/// </summary>
~CurlHttpMultiPartForm()
{
Dispose(false);
}
// for CurlEasy.SetOpt()
internal IntPtr GetHandle() => _pItems[0];
/// <summary>
/// Add a multi-part form section.
/// </summary>
/// <param name="args">
/// Argument list, as described in the remarks.
/// </param>
/// <returns>
/// A <see cref="CurlFormCode" />, hopefully
/// <c>CurlFormCode.Ok</c>.
/// </returns>
/// <remarks>
/// This is definitely the workhorse method for this class. It
/// should be called in roughly the same manner as
/// <c>curl_formadd()</c>, except you would omit the first two
/// <c>struct curl_httppost**</c> arguments (<c>firstitem</c> and
/// <c>lastitem</c>), which are wrapped in this class. So you should
/// pass arguments in the following sequence:
/// <para>
/// <c>
/// CurlHttpMultiPartForm.AddSection(option1, value1, ..., optionX, valueX,
/// CurlFormOption.End)
/// </c>
/// ;
/// </para>
/// <para>
/// For a complete list of possible options, see the documentation for
/// the <see cref="CurlFormOption" /> enumeration.
/// </para>
/// <note>
/// The pointer options (<c>PtrName</c>, etc.) make an
/// internal copy of the passed <c>byte</c> array. Therefore, any
/// changes you make to the client copy of this array AFTER calling
/// this method, won't be reflected internally with <c>cURL</c>. The
/// purpose of providing the pointer options is to support the
/// posting of non-string binary data.
/// </note>
/// </remarks>
public CurlFormCode AddSection(params object[] args)
{
var nCount = args.Length;
var nRealCount = nCount;
var retCode = CurlFormCode.Ok;
CurlForms[] aForms = null;
// one arg or even number of args is an error
if ((nCount == 1) || (nCount%2 == 0))
return CurlFormCode.Incomplete;
// ensure the last argument is End
var iCode = (CurlFormOption)
Convert.ToInt32(args.GetValue(nCount - 1));
if (iCode != CurlFormOption.End)
return CurlFormCode.Incomplete;
// walk through any passed arrays to get the true number of
// items and ensure the child arrays are properly (and not
// prematurely) terminated with End
for (var i = 0; i < nCount; i += 2)
{
iCode = (CurlFormOption) Convert.ToInt32(args.GetValue(i));
switch (iCode)
{
case CurlFormOption.Array:
{
aForms = args.GetValue(i + 1) as CurlForms[];
if (aForms == null)
return CurlFormCode.Incomplete;
var nFormsCount = aForms.Length;
for (var j = 0; j < nFormsCount; j++)
{
var pcf = aForms.GetValue(j) as CurlForms;
if (pcf == null)
return CurlFormCode.Incomplete;
if (j == nFormsCount - 1)
{
if (pcf.Option != CurlFormOption.End)
return CurlFormCode.Incomplete;
}
else
{
if (pcf.Option == CurlFormOption.End)
return CurlFormCode.Incomplete;
}
}
// -2 accounts for the fact that we're a) not
// including the item with End and b) not
// including Array in what we pass to cURL
nRealCount += 2*(nFormsCount - 2);
break;
}
}
}
// allocate the IntPtr array for the data
var aPointers = new IntPtr[nRealCount];
for (var i = 0; i < nRealCount - 1; i++)
aPointers[i] = IntPtr.Zero;
aPointers[nRealCount - 1] = (IntPtr) CurlFormOption.End;
// now we go through the args
aForms = null;
var formArrayPos = 0;
var argArrayPos = 0;
var ptrArrayPos = 0;
object obj = null;
while ((retCode == CurlFormCode.Ok) &&
(ptrArrayPos < nRealCount))
{
if (aForms != null)
{
var pcf = aForms.GetValue(formArrayPos++)
as CurlForms;
if (pcf == null)
{
retCode = CurlFormCode.UnknownOption;
break;
}
iCode = pcf.Option;
obj = pcf.Value;
}
else
{
iCode = (CurlFormOption) Convert.ToInt32(
args.GetValue(argArrayPos++));
obj = iCode == CurlFormOption.End
? null
: args.GetValue(argArrayPos++);
}
switch (iCode)
{
// handle byte-array pointer-related items
case CurlFormOption.PtrName:
case CurlFormOption.PtrContents:
case CurlFormOption.BufferPtr:
{
var bytes = obj as byte[];
if (bytes == null)
retCode = CurlFormCode.UnknownOption;
else
{
var nLen = bytes.Length;
var ptr = Marshal.AllocHGlobal(nLen);
if (ptr != IntPtr.Zero)
{
aPointers[ptrArrayPos++] = (IntPtr) iCode;
// copy bytes to unmanaged buffer
for (var j = 0; j < nLen; j++)
Marshal.WriteByte(ptr, bytes[j]);
aPointers[ptrArrayPos++] = ptr;
}
else
retCode = CurlFormCode.Memory;
}
break;
}
// length values
case CurlFormOption.NameLength:
case CurlFormOption.ContentsLength:
case CurlFormOption.BufferLength:
aPointers[ptrArrayPos++] = (IntPtr) iCode;
aPointers[ptrArrayPos++] = (IntPtr)
Convert.ToInt32(obj);
break;
// strings
case CurlFormOption.CopyName:
case CurlFormOption.CopyContents:
case CurlFormOption.FileContent:
case CurlFormOption.File:
case CurlFormOption.ContentType:
case CurlFormOption.Filename:
case CurlFormOption.Buffer:
{
aPointers[ptrArrayPos++] = (IntPtr) iCode;
var s = obj as string;
if (s == null)
retCode = CurlFormCode.UnknownOption;
else
{
var p = Marshal.StringToHGlobalAnsi(s);
if (p != IntPtr.Zero)
aPointers[ptrArrayPos++] = p;
else
retCode = CurlFormCode.Memory;
}
break;
}
// array case: already handled
case CurlFormOption.Array:
if (aForms != null)
retCode = CurlFormCode.IllegalArray;
else
{
aForms = obj as CurlForms[];
if (aForms == null)
retCode = CurlFormCode.UnknownOption;
}
break;
// slist
case CurlFormOption.ContentHeader:
{
aPointers[ptrArrayPos++] = (IntPtr) iCode;
var s = obj as CurlSlist;
if (s == null)
retCode = CurlFormCode.UnknownOption;
else
aPointers[ptrArrayPos++] = s.Handle;
break;
}
// erroneous stuff
case CurlFormOption.Nothing:
retCode = CurlFormCode.Incomplete;
break;
// end
case CurlFormOption.End:
if (aForms != null) // end of form
{
aForms = null;
formArrayPos = 0;
}
else
aPointers[ptrArrayPos++] = (IntPtr) iCode;
break;
// default is unknown
default:
retCode = CurlFormCode.UnknownOption;
break;
}
}
// ensure we didn't come up short on parameters
if (ptrArrayPos != nRealCount)
retCode = CurlFormCode.Incomplete;
// if we're OK here, call into curl
if (retCode == CurlFormCode.Ok)
{
#if USE_LIBCURLSHIM
retCode = (CurlFormCode) NativeMethods.curl_shim_formadd(_pItems, aPointers, nRealCount);
#else
retCode = (CurlFormCode) NativeMethods.curl_formadd(ref _pItems[0], ref _pItems[1],
(int) aPointers[0], aPointers[1],
(int) aPointers[2], aPointers[3],
(int) aPointers[4]);
#endif
}
// unmarshal native allocations
for (var i = 0; i < nRealCount - 1; i += 2)
{
iCode = (CurlFormOption) (int) aPointers[i];
switch (iCode)
{
case CurlFormOption.CopyName:
case CurlFormOption.CopyContents:
case CurlFormOption.FileContent:
case CurlFormOption.File:
case CurlFormOption.ContentType:
case CurlFormOption.Filename:
case CurlFormOption.Buffer:
// byte buffer cases
case CurlFormOption.PtrName:
case CurlFormOption.PtrContents:
case CurlFormOption.BufferPtr:
{
if (aPointers[i + 1] != IntPtr.Zero)
Marshal.FreeHGlobal(aPointers[i + 1]);
break;
}
default:
break;
}
}
return retCode;
}
private void Dispose(bool disposing)
{
lock (this)
{
if (disposing)
{
// clean up managed objects
}
// clean up native objects
if (_pItems[0] != IntPtr.Zero)
NativeMethods.curl_formfree(_pItems[0]);
_pItems[0] = IntPtr.Zero;
_pItems[1] = IntPtr.Zero;
}
}
}
}

View File

@@ -1,304 +0,0 @@
/***************************************************************************
*
* CurlS#arp
*
* Copyright (c) 2013-2017 Dr. Masroor Ehsan (masroore@gmail.com)
* Portions copyright (c) 2004, 2005 Jeff Phillips (jeff@jeffp.net)
*
* This software is licensed as described in the file LICENSE, which you
* should have received as part of this distribution.
*
* You may opt to use, copy, modify, merge, publish, distribute and/or sell
* copies of this Software, and permit persons to whom the Software is
* furnished to do so, under the terms of the LICENSE file.
*
* This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF
* ANY KIND, either express or implied.
*
**************************************************************************/
using System;
using System.Collections;
using System.Runtime.InteropServices;
using CurlSharp.Enums;
namespace CurlSharp
{
/// <summary>
/// Implements the <c>curl_multi_xxx</c> API.
/// </summary>
public class CurlMulti : IDisposable
{
// private members
private readonly Hashtable _htEasy;
private int _maxFd;
private CurlMultiInfo[] _multiInfo;
private bool _bGotMultiInfo;
#if USE_LIBCURLSHIM
private IntPtr _fdSets;
#else
private NativeMethods.fd_set _fd_read, _fd_write, _fd_except;
#endif
private IntPtr _pMulti;
private CurlPipelining _pipelining;
/// <summary>
/// Constructor
/// </summary>
/// <exception cref="System.InvalidOperationException">
/// This is thrown
/// if <see cref="Curl" /> hasn't bee properly initialized.
/// </exception>
/// <exception cref="System.NullReferenceException">
/// This is thrown if the native <c>CurlMulti</c> handle wasn't
/// created successfully.
/// </exception>
public CurlMulti()
{
Curl.EnsureCurl();
_pMulti = NativeMethods.curl_multi_init();
ensureHandle();
_maxFd = 0;
#if USE_LIBCURLSHIM
_fdSets = IntPtr.Zero;
_fdSets = NativeMethods.curl_shim_alloc_fd_sets();
#else
_fd_read = NativeMethods.fd_set.Create();
_fd_read = NativeMethods.fd_set.Create();
_fd_write = NativeMethods.fd_set.Create();
_fd_except = NativeMethods.fd_set.Create();
#endif
_multiInfo = null;
_bGotMultiInfo = false;
_htEasy = new Hashtable();
}
/// <summary>
/// Max file descriptor
/// </summary>
public int MaxFd => _maxFd;
/// <summary>
/// Cleanup unmanaged resources.
/// </summary>
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
/// <summary>
/// Destructor
/// </summary>
~CurlMulti()
{
Dispose(false);
}
private void Dispose(bool disposing)
{
lock (this)
{
// if (disposing) // managed member cleanup
// unmanaged cleanup
if (_pMulti != IntPtr.Zero)
{
NativeMethods.curl_multi_cleanup(_pMulti);
_pMulti = IntPtr.Zero;
}
#if USE_LIBCURLSHIM
if (_fdSets != IntPtr.Zero)
{
NativeMethods.curl_shim_free_fd_sets(_fdSets);
_fdSets = IntPtr.Zero;
}
#else
_fd_read.Cleanup();
_fd_write.Cleanup();
_fd_except.Cleanup();
#endif
}
}
private void ensureHandle()
{
if (_pMulti == IntPtr.Zero)
throw new NullReferenceException("No internal multi handle");
}
/// <summary>
/// Add an CurlEasy object.
/// </summary>
/// <param name="curlEasy">
/// <see cref="CurlEasy" /> object to add.
/// </param>
/// <returns>
/// A <see cref="CurlMultiCode" />, hopefully <c>CurlMultiCode.Ok</c>
/// </returns>
/// <exception cref="System.NullReferenceException">
/// This is thrown if the native <c>CurlMulti</c> handle wasn't
/// created successfully.
/// </exception>
public CurlMultiCode AddHandle(CurlEasy curlEasy)
{
ensureHandle();
var p = curlEasy.Handle;
_htEasy.Add(p, curlEasy);
return NativeMethods.curl_multi_add_handle(_pMulti, p);
}
public CurlPipelining Pipelining
{
get { return _pipelining; }
set
{
ensureHandle();
_pipelining = value;
NativeMethods.curl_multi_setopt(_pMulti, CurlMultiOption.Pipelining, (long) value);
}
}
/// <summary>
/// Remove an CurlEasy object.
/// </summary>
/// <param name="curlEasy">
/// <see cref="CurlEasy" /> object to remove.
/// </param>
/// <returns>
/// A <see cref="CurlMultiCode" />, hopefully <c>CurlMultiCode.Ok</c>
/// </returns>
/// <exception cref="System.NullReferenceException">
/// This is thrown if the native <c>CurlMulti</c> handle wasn't
/// created successfully.
/// </exception>
public CurlMultiCode RemoveHandle(CurlEasy curlEasy)
{
ensureHandle();
var p = curlEasy.Handle;
_htEasy.Remove(p);
return NativeMethods.curl_multi_remove_handle(_pMulti, curlEasy.Handle);
}
/// <summary>
/// Get a string description of an error code.
/// </summary>
/// <param name="errorNum">
/// The <see cref="CurlMultiCode" /> for which to obtain the error
/// string description.
/// </param>
/// <returns>The string description.</returns>
public string StrError(CurlMultiCode errorNum) => Marshal.PtrToStringAnsi(NativeMethods.curl_multi_strerror(errorNum));
/// <summary>
/// Read/write data to/from each CurlEasy object.
/// </summary>
/// <param name="runningObjects">
/// The number of <see cref="CurlEasy" /> objects still in process is
/// written by this function to this reference parameter.
/// </param>
/// <returns>
/// A <see cref="CurlMultiCode" />, hopefully <c>CurlMultiCode.Ok</c>
/// </returns>
/// <exception cref="System.NullReferenceException">
/// This is thrown if the native <c>CurlMulti</c> handle wasn't
/// created successfully.
/// </exception>
public CurlMultiCode Perform(ref int runningObjects)
{
ensureHandle();
return NativeMethods.curl_multi_perform(_pMulti, ref runningObjects);
}
/// <summary>
/// Set internal file desriptor information before calling Select.
/// </summary>
/// <returns>
/// A <see cref="CurlMultiCode" />, hopefully <c>CurlMultiCode.Ok</c>
/// </returns>
/// <exception cref="System.NullReferenceException">
/// This is thrown if the native <c>CurlMulti</c> handle wasn't
/// created successfully.
/// </exception>
public CurlMultiCode FdSet()
{
ensureHandle();
#if USE_LIBCURLSHIM
return NativeMethods.curl_shim_multi_fdset(_pMulti, _fdSets, ref _maxFd);
#else
NativeMethods.FD_ZERO(_fd_read);
NativeMethods.FD_ZERO(_fd_write);
NativeMethods.FD_ZERO(_fd_except);
return NativeMethods.curl_multi_fdset(_pMulti, ref _fd_read, ref _fd_write, ref _fd_except, ref _maxFd);
#endif
}
/// <summary>
/// Call <c>select()</c> on the CurlEasy objects.
/// </summary>
/// <param name="timeoutMillis">
/// The timeout for the internal <c>select()</c> call,
/// in milliseconds.
/// </param>
/// <returns>
/// Number or <see cref="CurlEasy" /> objects with pending reads.
/// </returns>
/// <exception cref="System.NullReferenceException">
/// This is thrown if the native <c>CurlMulti</c> handle wasn't
/// created successfully.
/// </exception>
public int Select(int timeoutMillis)
{
ensureHandle();
#if USE_LIBCURLSHIM
return NativeMethods.curl_shim_select(_maxFd + 1, _fdSets, timeoutMillis);
#else
var timeout = NativeMethods.timeval.Create(timeoutMillis);
return NativeMethods.select(_maxFd + 1, ref _fd_read, ref _fd_write, ref _fd_except, ref timeout);
//return NativeMethods.select2(_maxFd + 1, _fd_read, _fd_write, _fd_except, timeout);
#endif
}
/// <summary>
/// Obtain status information for a CurlMulti transfer. Requires
/// CurlSharp be compiled with the libcurlshim helper.
/// </summary>
/// <returns>
/// An array of <see cref="CurlMultiInfo" /> objects, one for each
/// <see cref="CurlEasy" /> object child.
/// </returns>
/// <exception cref="System.NullReferenceException">
/// This is thrown if the native <c>CurlMulti</c> handle wasn't
/// created successfully.
/// </exception>
public CurlMultiInfo[] InfoRead()
{
if (_bGotMultiInfo)
return _multiInfo;
#if USE_LIBCURLSHIM
var nMsgs = 0;
var pInfo = NativeMethods.curl_shim_multi_info_read(_pMulti, ref nMsgs);
if (pInfo != IntPtr.Zero)
{
_multiInfo = new CurlMultiInfo[nMsgs];
for (var i = 0; i < nMsgs; i++)
{
var msg = (CurlMessage) Marshal.ReadInt32(pInfo, i*12);
var pEasy = Marshal.ReadIntPtr(pInfo, i*12 + 4);
var code = (CurlCode) Marshal.ReadInt32(pInfo, i*12 + 8);
_multiInfo[i] = new CurlMultiInfo(msg, (CurlEasy) _htEasy[pEasy], code);
}
NativeMethods.curl_shim_multi_info_free(pInfo);
}
_bGotMultiInfo = true;
#else
_multiInfo = null;
throw new NotImplementedException("CurlMulti.InfoRead()");
#endif
#pragma warning disable CS0162 // Unreachable code detected when not compiling with the shim
return _multiInfo;
#pragma warning restore CS0162 // Unreachable code detected when not compiling with the shim
}
}
}

View File

@@ -1,55 +0,0 @@
/***************************************************************************
*
* CurlS#arp
*
* Copyright (c) 2013-2017 Dr. Masroor Ehsan (masroore@gmail.com)
* Portions copyright (c) 2004, 2005 Jeff Phillips (jeff@jeffp.net)
*
* This software is licensed as described in the file LICENSE, which you
* should have received as part of this distribution.
*
* You may opt to use, copy, modify, merge, publish, distribute and/or sell
* copies of this Software, and permit persons to whom the Software is
* furnished to do so, under the terms of the LICENSE file.
*
* This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF
* ANY KIND, either express or implied.
*
**************************************************************************/
using CurlSharp.Enums;
namespace CurlSharp
{
/// <summary>
/// Wraps the <c>cURL</c> struct <c>CURLMsg</c>. This class provides
/// status information following a <see cref="CurlMulti" /> transfer.
/// </summary>
public sealed class CurlMultiInfo
{
// private members
internal CurlMultiInfo(CurlMessage msg, CurlEasy curlEasy, CurlCode result)
{
Msg = msg;
CurlEasyHandle = curlEasy;
Result = result;
}
/// <summary>
/// Get the status code from the <see cref="CurlMessage" /> enumeration.
/// </summary>
public CurlMessage Msg { get; }
/// <summary>
/// Get the <see cref="CurlEasy" /> object for this child.
/// </summary>
public CurlEasy CurlEasyHandle { get; }
/// <summary>
/// Get the return code for the transfer, as a
/// <see cref="CurlCode" />.
/// </summary>
public CurlCode Result { get; }
}
}

View File

@@ -1,242 +0,0 @@
/***************************************************************************
*
* CurlS#arp
*
* Copyright (c) 2013 Dr. Masroor Ehsan (masroore@gmail.com)
* Portions copyright (c) 2004, 2005 Jeff Phillips (jeff@jeffp.net)
*
* This software is licensed as described in the file LICENSE, which you
* should have received as part of this distribution.
*
* You may opt to use, copy, modify, merge, publish, distribute and/or sell
* copies of this Software, and permit persons to whom the Software is
* furnished to do so, under the terms of the LICENSE file.
*
* This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF
* ANY KIND, either express or implied.
*
**************************************************************************/
using System;
using System.Runtime.InteropServices;
using CurlSharp.Callbacks;
using CurlSharp.Enums;
namespace CurlSharp
{
/// <summary>
/// This class provides an infrastructure for serializing access to data
/// shared by multiple <see cref="CurlEasy" /> objects, including cookie data
/// and Dns hosts. It implements the <c>curl_share_xxx</c> API.
/// </summary>
public class CurlShare : IDisposable
{
// private members
private GCHandle _hThis; // for handle extraction
#if USE_LIBCURLSHIM
private NativeMethods._ShimLockCallback _pDelLock; // lock delegate
private NativeMethods._ShimUnlockCallback _pDelUnlock; // unlock delegate
#endif
private IntPtr _pShare; // share handle
private IntPtr _ptrThis; // numeric handle
/// <summary>
/// Constructor
/// </summary>
/// <exception cref="System.InvalidOperationException">
/// This is thrown
/// if <see cref="Curl" /> hasn't bee properly initialized.
/// </exception>
/// <exception cref="System.NullReferenceException">
/// This is thrown if
/// the native <c>share</c> handle wasn't created successfully.
/// </exception>
public CurlShare()
{
Curl.EnsureCurl();
_pShare = NativeMethods.curl_share_init();
EnsureHandle();
LockFunction = null;
UnlockFunction = null;
UserData = null;
installDelegates();
}
public object UserData { get; set; }
public CurlShareUnlockCallback UnlockFunction { get; set; }
public CurlShareLockCallback LockFunction { get; set; }
public CurlLockData Share
{
set { setShareOption(CurlShareOption.Share, value); }
}
public CurlLockData Unshare
{
set { setShareOption(CurlShareOption.Unshare, value); }
}
public CurlShareCode LastErrorCode { get; private set; }
public string LastErrorDescription { get; private set; }
/// <summary>
/// Cleanup unmanaged resources.
/// </summary>
public void Dispose()
{
Dispose(true);
GC.SuppressFinalize(this);
}
/// <summary>
/// Destructor
/// </summary>
~CurlShare()
{
Dispose(false);
}
/// <summary>
/// Set options for this object.
/// </summary>
/// <param name="option">
/// One of the values in the <see cref="CurlShareOption" />
/// enumeration.
/// </param>
/// <param name="parameter">
/// An appropriate object based on the value passed in the
/// <c>option</c> argument. See <see cref="CurlShareOption" />
/// for more information about the appropriate parameter type.
/// </param>
/// <returns>
/// A <see cref="CurlShareCode" />, hopefully
/// <c>CurlShareCode.Ok</c>.
/// </returns>
/// <exception cref="System.NullReferenceException">
/// This is thrown if
/// the native <c>share</c> handle wasn't created successfully.
/// </exception>
public CurlShareCode SetOpt(CurlShareOption option, object parameter)
{
EnsureHandle();
var retCode = CurlShareCode.Ok;
switch (option)
{
case CurlShareOption.LockFunction:
var lf = parameter as CurlShareLockCallback;
if (lf == null)
return CurlShareCode.BadOption;
LockFunction = lf;
break;
case CurlShareOption.UnlockFunction:
var ulf = parameter as CurlShareUnlockCallback;
if (ulf == null)
return CurlShareCode.BadOption;
UnlockFunction = ulf;
break;
case CurlShareOption.Share:
case CurlShareOption.Unshare:
{
var opt = (CurlLockData) Convert.ToInt32(parameter);
retCode = setShareOption(option, opt);
break;
}
case CurlShareOption.UserData:
UserData = parameter;
break;
default:
retCode = CurlShareCode.BadOption;
break;
}
return retCode;
}
private void setLastError(CurlShareCode code, CurlShareOption opt)
{
if ((LastErrorCode == CurlShareCode.Ok) && (code != CurlShareCode.Ok))
{
LastErrorCode = code;
LastErrorDescription = $"Error: {StrError(code)} setting option {opt}";
}
}
private CurlShareCode setShareOption(CurlShareOption option, CurlLockData value)
{
var retCode = (value != CurlLockData.Cookie) && (value != CurlLockData.Dns)
? CurlShareCode.BadOption
: NativeMethods.curl_share_setopt(_pShare, option, (IntPtr) value);
setLastError(retCode, option);
return retCode;
}
/// <summary>
/// Return a String description of an error code.
/// </summary>
/// <param name="errorNum">
/// The <see cref="CurlShareCode" /> for which to obtain the error
/// string description.
/// </param>
/// <returns>The string description.</returns>
public string StrError(CurlShareCode errorNum)
=> Marshal.PtrToStringAnsi(NativeMethods.curl_share_strerror(errorNum));
private void Dispose(bool disposing)
{
lock (this)
{
// if (disposing) cleanup managed objects
if (_pShare != IntPtr.Zero)
{
#if USE_LIBCURLSHIM
NativeMethods.curl_shim_cleanup_share_delegates(_pShare);
#endif
NativeMethods.curl_share_cleanup(_pShare);
_hThis.Free();
_ptrThis = IntPtr.Zero;
_pShare = IntPtr.Zero;
}
}
}
internal IntPtr GetHandle() => _pShare;
private void EnsureHandle()
{
if (_pShare == IntPtr.Zero)
throw new NullReferenceException("No internal share handle");
}
private void installDelegates()
{
_hThis = GCHandle.Alloc(this);
_ptrThis = (IntPtr) _hThis;
#if USE_LIBCURLSHIM
_pDelLock = LockDelegate;
_pDelUnlock = UnlockDelegate;
NativeMethods.curl_shim_install_share_delegates(_pShare, _ptrThis, _pDelLock, _pDelUnlock);
#endif
}
internal static void LockDelegate(int data, int access, IntPtr userPtr)
{
var gch = (GCHandle) userPtr;
var share = (CurlShare) gch.Target;
share?.LockFunction?.Invoke((CurlLockData) data, (CurlLockAccess) access, share.UserData);
}
internal static void UnlockDelegate(int data, IntPtr userPtr)
{
var gch = (GCHandle) userPtr;
var share = (CurlShare) gch.Target;
share?.UnlockFunction?.Invoke((CurlLockData) data, share.UserData);
}
}
}

View File

@@ -1,12 +0,0 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFrameworks>net452;netstandard2.0</TargetFrameworks>
</PropertyGroup>
<PropertyGroup>
<AllowUnsafeBlocks>True</AllowUnsafeBlocks>
<PackageId>CurlSharp</PackageId>
<GeneratePackageOnBuild>false</GeneratePackageOnBuild>
</PropertyGroup>
</Project>

View File

@@ -1,141 +0,0 @@
/***************************************************************************
*
* CurlS#arp
*
* Copyright (c) 2013-2017 Dr. Masroor Ehsan (masroore@gmail.com)
* Portions copyright (c) 2004, 2005 Jeff Phillips (jeff@jeffp.net)
*
* This software is licensed as described in the file LICENSE, which you
* should have received as part of this distribution.
*
* You may opt to use, copy, modify, merge, publish, distribute and/or sell
* copies of this Software, and permit persons to whom the Software is
* furnished to do so, under the terms of the LICENSE file.
*
* This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF
* ANY KIND, either express or implied.
*
**************************************************************************/
using System;
using System.Collections.Generic;
using System.Runtime.InteropServices;
using CurlSharp.Enums;
namespace CurlSharp
{
/// <summary>
/// This class wraps a linked list of strings used in <c>cURL</c>. Use it
/// to build string lists where they're required, such as when calling
/// <see cref="CurlEasy.SetOpt" /> with <see cref="CurlOption.Quote" />
/// as the option.
/// </summary>
public class CurlSlist : IDisposable
{
/// <summary>
/// Constructor
/// </summary>
/// <exception cref="System.InvalidOperationException">
/// This is thrown
/// if <see cref="Curl" /> hasn't bee properly initialized.
/// </exception>
public CurlSlist()
{
Curl.EnsureCurl();
Handle = IntPtr.Zero;
}
public CurlSlist(IntPtr handle)
{
Handle = handle;
}
/// <summary>
/// Read-only copy of the strings stored in the SList
/// </summary>
public List<string> Strings
{
get
{
if (Handle == IntPtr.Zero)
return null;
var strings = new List<string>();
#if !USE_LIBCURLSHIM
var slist = new curl_slist();
Marshal.PtrToStructure(Handle, slist);
while (true)
{
strings.Add(slist.data);
if (slist.next != IntPtr.Zero)
Marshal.PtrToStructure(slist.next, slist);
else
break;
}
#endif
return strings;
}
}
internal IntPtr Handle { get; private set; }
/// <summary>
/// Free all internal strings.
/// </summary>
public void Dispose()
{
GC.SuppressFinalize(this);
Dispose(true);
}
/// <summary>
/// Destructor
/// </summary>
~CurlSlist()
{
Dispose(false);
}
/// <summary>
/// Append a string to the list.
/// </summary>
/// <param name="str">The <c>string</c> to append.</param>
public void Append(string str)
{
#if USE_LIBCURLSHIM
Handle = NativeMethods.curl_shim_add_string_to_slist(Handle, str);
#else
Handle = NativeMethods.curl_slist_append(Handle, str);
#endif
}
private void Dispose(bool disposing)
{
lock (this)
{
if (Handle != IntPtr.Zero)
{
#if USE_LIBCURLSHIM
NativeMethods.curl_shim_free_slist(Handle);
#else
NativeMethods.curl_slist_free_all(Handle);
#endif
Handle = IntPtr.Zero;
}
}
}
#if !USE_LIBCURLSHIM
[StructLayout(LayoutKind.Sequential)]
private class curl_slist
{
/// char*
[MarshalAs(UnmanagedType.LPStr)] public string data;
/// curl_slist*
public IntPtr next;
}
#endif
}
}
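A brief usage sketch for this class, assuming Curl global initialization has already happened, that CurlEasy is disposable, and that CurlOption.HttpHeader is the slist-valued option name (all hedged against your CurlSharp build):

using CurlSharp;
using CurlSharp.Enums;

internal static class SlistUsageSketch
{
    internal static CurlCode SendWithHeaders()
    {
        // Both objects own native resources, so dispose them deterministically.
        using (var headers = new CurlSlist())
        using (var easy = new CurlEasy())
        {
            headers.Append("Accept: application/json");
            headers.Append("X-Custom: value");

            easy.SetOpt(CurlOption.Url, "http://example.com/");
            easy.SetOpt(CurlOption.HttpHeader, headers);   // assumed option name

            return easy.Perform();
        }
    }
}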

View File

@@ -1,45 +0,0 @@
/***************************************************************************
*
* CurlS#arp
*
* Copyright (c) 2013-2017 Dr. Masroor Ehsan (masroore@gmail.com)
* Portions copyright (c) 2004, 2005 Jeff Phillips (jeff@jeffp.net)
*
* This software is licensed as described in the file LICENSE, which you
* should have received as part of this distribution.
*
* You may opt to use, copy, modify, merge, publish, distribute and/or sell
* copies of this Software, and permit persons to whom the Software is
* furnished to do so, under the terms of the LICENSE file.
*
* This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF
* ANY KIND, either express or implied.
*
**************************************************************************/
using System;
using CurlSharp.Callbacks;
namespace CurlSharp
{
/// <summary>
/// An instance of this class is passed to the delegate
/// <see cref="CurlSslContextCallback" />, if it's implemented.
/// Within that delegate, the code will have to make native calls to
/// the <c>OpenSSL</c> library with the value returned from the
/// <see cref="CurlSslContext.Context" /> property cast to an
/// <c>SSL_CTX</c> pointer.
/// </summary>
public sealed class CurlSslContext
{
internal CurlSslContext(IntPtr pvContext)
{
Context = pvContext;
}
/// <summary>
/// Get the underlying OpenSSL context.
/// </summary>
public IntPtr Context { get; }
}
}
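A hedged sketch of a handler that receives this wrapper; the delegate signature is assumed, and the actual OpenSSL configuration of the SSL_CTX is only indicated by a comment:

using System;
using CurlSharp;
using CurlSharp.Enums;

internal static class SslContextCallbackSketch
{
    // Assumed shape: the callback gets the CurlSslContext plus user data and returns a CurlCode.
    internal static CurlCode OnSslContext(CurlSslContext ctx, object userData)
    {
        // ctx.Context is the raw SSL_CTX*; real code would P/Invoke into OpenSSL here.
        return ctx.Context != IntPtr.Zero ? CurlCode.Ok : CurlCode.SslConnectError;
    }
}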

View File

@@ -1,207 +0,0 @@
/***************************************************************************
*
* CurlS#arp
*
* Copyright (c) 2013-2017 Dr. Masroor Ehsan (masroore@gmail.com)
* Portions copyright (c) 2004, 2005 Jeff Phillips (jeff@jeffp.net)
*
* This software is licensed as described in the file LICENSE, which you
* should have received as part of this distribution.
*
* You may opt to use, copy, modify, merge, publish, distribute and/or sell
* copies of this Software, and permit persons to whom the Software is
* furnished to do so, under the terms of the LICENSE file.
*
* This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF
* ANY KIND, either express or implied.
*
**************************************************************************/
using System;
using System.Runtime.InteropServices;
using CurlSharp.Enums;
namespace CurlSharp
{
/// <summary>
/// This class wraps a <c>curl_version_info_data</c> struct. An instance is
/// obtained by calling <see cref="Curl.GetVersionInfo" />.
/// </summary>
public sealed class CurlVersionInfoData
{
private const int OFFSET_AGE = 0;
private const int OFFSET_VERSION = 4;
private const int OFFSET_VERSION_NUM = 8;
private const int OFFSET_HOST = 12;
private const int OFFSET_FEATURES = 16;
private const int OFFSET_SSL_VERSION = 20;
private const int OFFSET_SSL_VERSION_NUM = 24;
private const int OFFSET_LIBZ_VERSION = 28;
private const int OFFSET_PROTOCOLS = 32;
private const int OFFSET_ARES_VERSION = 36;
private const int OFFSET_ARES_VERSION_NUM = 40;
private const int OFFSET_LIBIDN_VERSION = 44;
private readonly IntPtr m_pVersionInfoData;
internal CurlVersionInfoData(CurlVersion ver)
{
m_pVersionInfoData = NativeMethods.curl_version_info(ver);
}
#if USE_LIBCURLSHIM
/// <summary>
/// Age of this struct, depending on how recent the linked-in
/// <c>libcurl</c> is, as a <see cref="CurlVersion" />.
/// </summary>
public CurlVersion Age
{
get { return (CurlVersion) NativeMethods.curl_shim_get_version_int_value(m_pVersionInfoData, OFFSET_AGE); }
}
/// <summary>
/// Get the internal cURL version, as a <c>string</c>.
/// </summary>
public string Version
{
get
{
return Marshal.PtrToStringAnsi(
NativeMethods.curl_shim_get_version_char_ptr(m_pVersionInfoData, OFFSET_VERSION));
}
}
/// <summary>
/// Get the internal cURL version number, a 24-bit number created
/// like this: [8 bits major number] | [8 bits minor number] | [8
/// bits patch number]. For example, Version 7.12.2 is <c>0x070C02</c>.
/// </summary>
public int VersionNum
{
get { return NativeMethods.curl_shim_get_version_int_value(m_pVersionInfoData, OFFSET_VERSION_NUM); }
}
/// <summary>
/// Get the host information on which the underlying cURL was built.
/// </summary>
public string Host
{
get
{
return
Marshal.PtrToStringAnsi(NativeMethods.curl_shim_get_version_char_ptr(m_pVersionInfoData, OFFSET_HOST));
}
}
/// <summary>
/// Get a bitmask of features, containing bits or'd from the
/// <see cref="CurlVersionFeatureBitmask" /> enumeration.
/// </summary>
public int Features
{
get { return NativeMethods.curl_shim_get_version_int_value(m_pVersionInfoData, OFFSET_FEATURES); }
}
/// <summary>
/// Get the Ssl version, if it's linked in.
/// </summary>
public string SslVersion
{
get
{
return
Marshal.PtrToStringAnsi(NativeMethods.curl_shim_get_version_char_ptr(m_pVersionInfoData,
OFFSET_SSL_VERSION));
}
}
/// <summary>
/// Get the Ssl version number, if Ssl is linked in.
/// </summary>
public int SSLVersionNum
{
get { return NativeMethods.curl_shim_get_version_int_value(m_pVersionInfoData, OFFSET_SSL_VERSION_NUM); }
}
/// <summary>
/// Get the libz version, if libz is linked in.
/// </summary>
public string LibZVersion
{
get
{
return
Marshal.PtrToStringAnsi(NativeMethods.curl_shim_get_version_char_ptr(m_pVersionInfoData,
OFFSET_LIBZ_VERSION));
}
}
/// <summary>
/// Get the names of the supported protocols.
/// </summary>
public string[] Protocols
{
get
{
var nProts = NativeMethods.curl_shim_get_number_of_protocols(
m_pVersionInfoData, OFFSET_PROTOCOLS);
var aProts = new String[nProts];
for (var i = 0; i < nProts; i++)
{
aProts[i] =
Marshal.PtrToStringAnsi(NativeMethods.curl_shim_get_protocol_string(m_pVersionInfoData,
OFFSET_PROTOCOLS, i));
}
return aProts;
}
}
/// <summary>
/// Get the ARes version, if ARes is linked in.
/// </summary>
public string ARes
{
get
{
if (Age > CurlVersion.First)
{
return
Marshal.PtrToStringAnsi(NativeMethods.curl_shim_get_version_char_ptr(m_pVersionInfoData,
OFFSET_ARES_VERSION));
}
return "n.a.";
}
}
/// <summary>
/// Get the ARes version number, if ARes is linked in.
/// </summary>
public int AResNum
{
get
{
if (Age > CurlVersion.First)
{
return NativeMethods.curl_shim_get_version_int_value(m_pVersionInfoData, OFFSET_ARES_VERSION_NUM);
}
return 0;
}
}
/// <summary>
/// Get the libidn version, if libidn is linked in.
/// </summary>
public string LibIdn
{
get
{
if (Age > CurlVersion.Second)
{
return
Marshal.PtrToStringAnsi(NativeMethods.curl_shim_get_version_char_ptr(m_pVersionInfoData,
OFFSET_LIBIDN_VERSION));
}
return "n.a.";
}
}
#endif
}
}
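An illustrative read of the version data; Curl.GetVersionInfo is assumed to take a CurlVersion and, in this source, the informational properties are only compiled under USE_LIBCURLSHIM, hence the conditional block:

using System;
using CurlSharp;
using CurlSharp.Enums;

internal static class VersionInfoSketch
{
    internal static void Print()
    {
        var info = Curl.GetVersionInfo(CurlVersion.Now);
#if USE_LIBCURLSHIM
        // These properties only exist in the shim build of this class.
        Console.WriteLine($"libcurl {info.Version} on {info.Host}");
        Console.WriteLine("Protocols: " + string.Join(", ", info.Protocols));
#endif
    }
}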

View File

@@ -1,46 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Contains values used to specify the order in which cached connections
/// are closed. One of these is passed as the
/// <see cref="CurlOption.ClosePolicy" /> option in a call
/// to <see cref="CurlEasy.SetOpt" />
/// </summary>
public enum CurlClosePolicy
{
/// <summary>
/// No close policy. Never use this.
/// </summary>
None = 0,
/// <summary>
/// Close the oldest cached connections first.
/// </summary>
Oldest = 1,
/// <summary>
/// Close the least recently used connections first.
/// </summary>
LeastRecentlyUsed = 2,
/// <summary>
/// Close the connections with the least traffic first.
/// </summary>
LeastTraffic = 3,
/// <summary>
/// Close the slowest connections first.
/// </summary>
Slowest = 4,
/// <summary>
/// Currently unimplemented.
/// </summary>
Callback = 5,
/// <summary>
/// End-of-enumeration marker; do not use in application code.
/// </summary>
Last = 6
};
}
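A one-line usage sketch, assuming CurlEasy.SetOpt returns a CurlCode; note that modern libcurl ignores the close-policy option, so this is purely illustrative:

using CurlSharp;
using CurlSharp.Enums;

internal static class ClosePolicySketch
{
    // Ask libcurl to drop the oldest cached connections first.
    internal static CurlCode ApplyOldestFirst(CurlEasy easy)
        => easy.SetOpt(CurlOption.ClosePolicy, CurlClosePolicy.Oldest);
}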

View File

@@ -1,403 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Status code returned from <see cref="CurlEasy" /> functions.
/// </summary>
public enum CurlCode
{
/// <summary>
/// All fine. Proceed as usual.
/// </summary>
Ok = 0,
/// <summary>
/// Aborted by callback. An internal callback returned "abort"
/// to libcurl.
/// </summary>
AbortedByCallback = 42,
/// <summary>
/// Internal error. A function was called in a bad order.
/// </summary>
BadCallingOrder = 44,
/// <summary>
/// Unrecognized transfer encoding.
/// </summary>
BadContentEncoding = 61,
/// <summary>
/// Attempting FTP resume beyond file size.
/// </summary>
BadDownloadResume = 36,
/// <summary>
/// Internal error. A function was called with a bad parameter.
/// </summary>
BadFunctionArgument = 43,
/// <summary>
/// Bad password entered. An error was signaled when the password was
/// entered. This can also be the result of a "bad password" returned
/// from a specified password callback.
/// </summary>
BadPasswordEntered = 46,
/// <summary>
/// Failed to connect to host or proxy.
/// </summary>
CouldntConnect = 7,
/// <summary>
/// Couldn't resolve host. The given remote host was not resolved.
/// </summary>
CouldntResolveHost = 6,
/// <summary>
/// Couldn't resolve proxy. The given proxy host could not be resolved.
/// </summary>
CouldntResolveProxy = 5,
/// <summary>
/// Very early initialization code failed. This is likely to be an
/// internal error or problem.
/// </summary>
FailedInit = 2,
/// <summary>
/// Maximum file size exceeded.
/// </summary>
FilesizeExceeded = 63,
/// <summary>
/// A file given with FILE:// couldn't be opened. Most likely
/// because the file path doesn't identify an existing file. Did
/// you check file permissions?
/// </summary>
FileCouldntReadFile = 37,
/// <summary>
/// We were denied access when trying to login to an FTP server or
/// when trying to change working directory to the one given in the URL.
/// </summary>
FtpAccessDenied = 9,
/// <summary>
/// An internal failure to lookup the host used for the new
/// connection.
/// </summary>
FtpCantGetHost = 15,
/// <summary>
/// A bad return code on either PASV or EPSV was sent by the FTP
/// server, preventing libcurl from being able to continue.
/// </summary>
FtpCantReconnect = 16,
/// <summary>
/// The FTP SIZE command returned error. SIZE is not a kosher FTP
/// command, it is an extension and not all servers support it. This
/// is not a surprising error.
/// </summary>
FtpCouldntGetSize = 32,
/// <summary>
/// This was either a weird reply to a 'RETR' command or a zero byte
/// transfer complete.
/// </summary>
FtpCouldntRetrFile = 19,
/// <summary>
/// libcurl failed to set ASCII transfer type (TYPE A).
/// </summary>
FtpCouldntSetAscii = 29,
/// <summary>
/// Received an error when trying to set the transfer mode to binary.
/// </summary>
FtpCouldntSetBinary = 17,
/// <summary>
/// FTP couldn't STOR file. The server denied the STOR operation.
/// The error buffer usually contains the server's explanation to this.
/// </summary>
FtpCouldntStorFile = 25,
/// <summary>
/// The FTP REST command returned error. This should never happen
/// if the server is sane.
/// </summary>
FtpCouldntUseRest = 31,
/// <summary>
/// The FTP PORT command returned error. This mostly happens when
/// you haven't specified a good enough address for libcurl to use.
/// See <see cref="CurlOption.FtpPort" />.
/// </summary>
FtpPortFailed = 30,
/// <summary>
/// When sending custom "QUOTE" commands to the remote server, one
/// of the commands returned an error code that was 400 or higher.
/// </summary>
FtpQuoteError = 21,
/// <summary>
/// Requested FTP Ssl level failed.
/// </summary>
FtpSslFailed = 64,
/// <summary>
/// The FTP server rejected access to the server after the password
/// was sent to it. It might be because the username and/or the
/// password were incorrect or just that the server is not allowing
/// you access for the moment etc.
/// </summary>
FtpUserPasswordIncorrect = 10,
/// <summary>
/// FTP servers return a 227-line as a response to a PASV command.
/// If libcurl fails to parse that line, this return code is
/// passed back.
/// </summary>
FtpWeird227Format = 14,
/// <summary>
/// After having sent the FTP password to the server, libcurl expects
/// a proper reply. This error code indicates that an unexpected code
/// was returned.
/// </summary>
FtpWeirdPassReply = 11,
/// <summary>
/// libcurl failed to get a sensible result back from the server as
/// a response to either a PASV or an EPSV command. The server is flawed.
/// </summary>
FtpWeirdPasvReply = 13,
/// <summary>
/// After connecting to an FTP server, libcurl expects to get a
/// certain reply back. This error code implies that it got a strange
/// or bad reply. The given remote server is probably not an
/// OK FTP server.
/// </summary>
FtpWeirdServerReply = 8,
/// <summary>
/// After having sent user name to the FTP server, libcurl expects a
/// proper reply. This error code indicates that an unexpected code
/// was returned.
/// </summary>
FtpWeirdUserReply = 12,
/// <summary>
/// After a completed file transfer, the FTP server did not respond a
/// proper "transfer successful" code.
/// </summary>
FtpWriteError = 20,
/// <summary>
/// Function not found. A required LDAP function was not found.
/// </summary>
FunctionNotFound = 41,
/// <summary>
/// Nothing was returned from the server, and under the circumstances,
/// getting nothing is considered an error.
/// </summary>
GotNothing = 52,
/// <summary>
/// This is an odd error that mainly occurs due to internal confusion.
/// </summary>
HttpPostError = 34,
/// <summary>
/// The HTTP server does not support or accept range requests.
/// </summary>
HttpRangeError = 33,
/// <summary>
/// This is returned if <see cref="CurlOption.FailOnError" />
/// is set TRUE and the HTTP server returns an error code that
/// is >= 400.
/// </summary>
HttpReturnedError = 22,
/// <summary>
/// Interface error. A specified outgoing interface could not be
/// used. Set which interface to use for outgoing connections'
/// source IP address with <see cref="CurlOption.Interface" />.
/// </summary>
InterfaceFailed = 45,
/// <summary>
/// End-of-enumeration marker; do not use in client applications.
/// </summary>
Last = 67,
/// <summary>
/// LDAP cannot bind. LDAP bind operation failed.
/// </summary>
LdapCannotBind = 38,
/// <summary>
/// Invalid LDAP URL.
/// </summary>
LdapInvalidUrl = 62,
/// <summary>
/// LDAP search failed.
/// </summary>
LdapSearchFailed = 39,
/// <summary>
/// Library not found. The LDAP library was not found.
/// </summary>
LibraryNotFound = 40,
/// <summary>
/// Malformat user. User name badly specified. *Not currently used*
/// </summary>
MalformatUser = 24,
/// <summary>
/// This is not an error. This used to be another error code in an
/// old libcurl version and is currently unused.
/// </summary>
Obsolete = 50,
/// <summary>
/// Operation timeout. The specified time-out period was reached
/// according to the conditions.
/// </summary>
OperationTimeouted = 28,
/// <summary>
/// Out of memory. A memory allocation request failed. This is serious
/// badness and things are severely messed up if this ever occurs.
/// </summary>
OutOfMemory = 27,
/// <summary>
/// A file transfer was shorter or larger than expected. This
/// happens when the server first reports an expected transfer size,
/// and then delivers data that doesn't match the previously
/// given size.
/// </summary>
PartialFile = 18,
/// <summary>
/// There was a problem reading a local file or an error returned by
/// the read callback.
/// </summary>
ReadError = 26,
/// <summary>
/// Failure with receiving network data.
/// </summary>
RecvError = 56,
/// <summary>
/// Failed sending network data.
/// </summary>
SendError = 55,
/// <summary>
/// Sending the data requires a rewind that failed.
/// </summary>
SendFailRewind = 65,
/// <summary>
/// CurlShare is in use.
/// </summary>
ShareInUse = 57,
/// <summary>
/// Problem with the CA cert (path? access rights?)
/// </summary>
SslCaCert = 60,
/// <summary>
/// There's a problem with the local client certificate.
/// </summary>
SslCertProblem = 58,
/// <summary>
/// Couldn't use specified cipher.
/// </summary>
SslCipher = 59,
/// <summary>
/// A problem occurred somewhere in the Ssl/TLS handshake. You really
/// want to use the <see cref="CurlEasy.CurlDebugCallback" /> delegate and read
/// the message there as it pinpoints the problem slightly more. It
/// could be certificates (file formats, paths, permissions),
/// passwords, and others.
/// </summary>
SslConnectError = 35,
/// <summary>
/// Failed to initialize Ssl engine.
/// </summary>
SslEngineInitFailed = 66,
/// <summary>
/// The specified crypto engine wasn't found.
/// </summary>
SslEngineNotFound = 53,
/// <summary>
/// Failed setting the selected Ssl crypto engine as default!
/// </summary>
SslEngineSetFailed = 54,
/// <summary>
/// The remote server's Ssl certificate was deemed not OK.
/// </summary>
SslPeerCertificate = 51,
/// <summary>
/// A telnet option string was improperly formatted.
/// </summary>
TelnetOptionSyntax = 49,
/// <summary>
/// Too many redirects. When following redirects, libcurl hit the
/// maximum amount. Set your limit with
/// <see cref="CurlOption.MaxRedirs" />.
/// </summary>
TooManyRedirects = 47,
/// <summary>
/// An option set with <see cref="CurlOption.TelnetOptions" />
/// was not recognized/known. Refer to the appropriate documentation.
/// </summary>
UnknownTelnetOption = 48,
/// <summary>
/// The URL you passed to libcurl used a protocol that this libcurl
/// does not support. The support might be a compile-time option that
/// wasn't used, it can be a misspelled protocol string or just a
/// protocol libcurl has no code for.
/// </summary>
UnsupportedProtocol = 1,
/// <summary>
/// The URL was not properly formatted.
/// </summary>
UrlMalformat = 3,
/// <summary>
/// URL user malformatted. The user-part of the URL syntax was not
/// correct.
/// </summary>
UrlMalformatUser = 4,
/// <summary>
/// An error occurred when writing received data to a local file,
/// or an error was returned to libcurl from a write callback.
/// </summary>
WriteError = 23,
};
}
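A typical check of the code returned by a transfer; easy.Perform() returning a CurlCode and a CurlEasy.StrError helper (analogous to CurlShare.StrError shown earlier) are assumptions:

using System;
using CurlSharp;
using CurlSharp.Enums;

internal static class CurlCodeCheckSketch
{
    internal static void Run(CurlEasy easy)
    {
        var code = easy.Perform();
        if (code != CurlCode.Ok)
        {
            // StrError maps the numeric code to a readable message (assumed API).
            Console.WriteLine($"Transfer failed: {code} ({easy.StrError(code)})");
        }
    }
}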

View File

@@ -1,76 +0,0 @@
/***************************************************************************
*
* Project: libcurl.NET
*
* Copyright (c) 2004, 2005 Jeff Phillips (jeff@jeffp.net)
*
* This software is licensed as described in the file LICENSE, which you
* should have received as part of this distribution.
*
* You may opt to use, copy, modify, merge, publish, distribute and/or sell
* copies of this Software, and permit persons to whom the Software is
* furnished to do so, under the terms of the LICENSE file.
*
* This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF
* ANY KIND, either express or implied.
*
* $Id: Enums.cs,v 1.1 2005/02/17 22:47:25 jeffreyphillips Exp $
**************************************************************************/
namespace CurlSharp.Enums
{
/// <summary>
/// One of these is returned by <see cref="CurlHttpMultiPartForm.AddSection" />.
/// </summary>
public enum CurlFormCode
{
/// <summary>
/// The section was added properly.
/// </summary>
Ok = 0,
/// <summary>
/// Out-of-memory when adding the section.
/// </summary>
Memory = 1,
/// <summary>
/// Invalid attempt to add the same option more than once to a
/// section.
/// </summary>
OptionTwice = 2,
/// <summary>
/// Invalid attempt to pass a <c>null</c> string or byte array in
/// one of the arguments.
/// </summary>
Null = 3,
/// <summary>
/// Invalid attempt to pass an unrecognized option in one of the
/// arguments.
/// </summary>
UnknownOption = 4,
/// <summary>
/// Incomplete argument lists.
/// </summary>
Incomplete = 5,
/// <summary>
/// Invalid attempt to provide a nested <c>Array</c>.
/// </summary>
IllegalArray = 6,
/// <summary>
/// This will not be returned so long as HTTP is enabled, which
/// it always is in libcurl.NET.
/// </summary>
Disabled = 7,
/// <summary>
/// End-of-enumeration marker; do not use in application code.
/// </summary>
Last = 8
};
}

View File

@@ -1,142 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// These are options available to build a multi-part form section
/// in a call to <see cref="CurlHttpMultiPartForm.AddSection" />
/// </summary>
public enum CurlFormOption
{
/// <summary>
/// Another possibility to send options to
/// <see cref="CurlHttpMultiPartForm.AddSection" /> is this option, that
/// passes a <see cref="CurlForms" /> array reference as its value.
/// Each <see cref="CurlForms" /> array element has a
/// <see cref="CurlFormOption" /> and a <c>string</c>. All available
/// options can be used in an array, except the <c>Array</c>
/// option itself! The last argument in such an array must always be
/// <c>End</c>.
/// </summary>
Array = 8,
/// <summary>
/// Followed by a <c>string</c>, tells libcurl that a buffer is to be
/// used to upload data instead of using a file.
/// </summary>
Buffer = 11,
/// <summary>
/// Followed by an <c>int</c> with the size of the
/// <c>BufferPtr</c> byte array, tells libcurl the length of
/// the data to upload.
/// </summary>
BufferLength = 13,
/// <summary>
/// Followed by a <c>byte[]</c> array, tells libcurl the address of
/// the buffer containing data to upload (as indicated with
/// <c>Buffer</c>). You must also use
/// <c>BufferLength</c> to set the length of the buffer area.
/// </summary>
BufferPtr = 12,
/// <summary>
/// Specifies extra headers for the form POST section. This takes an
/// <see cref="CurlSlist" /> prepared in the usual way using
/// <see cref="CurlSlist.Append" /> and appends the list of headers to
/// those libcurl automatically generates.
/// </summary>
ContentHeader = 15,
/// <summary>
/// Followed by an <c>int</c> setting the length of the contents.
/// </summary>
ContentsLength = 6,
/// <summary>
/// Followed by a <c>string</c> with a content-type will make cURL
/// use this given content-type for this file upload part, possibly
/// instead of an internally chosen one.
/// </summary>
ContentType = 14,
/// <summary>
/// Followed by a <c>string</c> used for the contents of this part, the
/// actual data to send away. If you'd like it to contain zero bytes,
/// you need to set the length of the contents with
/// <c>ContentsLength</c>.
/// </summary>
CopyContents = 4,
/// <summary>
/// Followed by a <c>string</c> used to set the name of this part.
/// If you'd like it to contain zero bytes, you need to set the
/// length of the name with <c>NameLength</c>.
/// </summary>
CopyName = 1,
/// <summary>
/// This should be the last argument to a call to
/// <see cref="CurlHttpMultiPartForm.AddSection" />.
/// </summary>
End = 17,
/// <summary>
/// Followed by a file name, makes this part a file upload part. It
/// sets the file name field to the actual file name used here,
/// it reads the contents of the file, passes them as data, and sets the
/// content-type if the given file matches one of the internally
/// known file extensions. For <c>File</c> the user may send
/// one or more files in one part by providing multiple <c>File</c>
/// arguments each followed by the filename (and each <c>File</c>
/// is allowed to have a <c>ContentType</c>).
/// </summary>
File = 10,
/// <summary>
/// Followed by a file name, and does the file read: the contents
/// will be used in as data in this part.
/// </summary>
FileContent = 7,
/// <summary>
/// Followed by a <c>string</c> file name, will make libcurl use the
/// given name in the file upload part, instead of the actual file
/// name given to <c>File</c>.
/// </summary>
Filename = 16,
/// <summary>
/// Followed by an <c>int</c> setting the length of the name.
/// </summary>
NameLength = 3,
/// <summary>
/// Not used.
/// </summary>
Nothing = 0,
/// <summary>
/// No longer used.
/// </summary>
Obsolete = 9,
/// <summary>
/// No longer used.
/// </summary>
Obsolete2 = 18,
/// <summary>
/// Followed by a <c>byte[]</c> used for the contents of this part.
/// If you'd like it to contain zero bytes, you need to set the
/// length of the contents with <c>ContentsLength</c>.
/// </summary>
PtrContents = 5,
/// <summary>
/// Followed by a <c>byte[]</c> used for the name of this part.
/// If you'd like it to contain zero bytes, you need to set the
/// length of the name with <c>NameLength</c>.
/// </summary>
PtrName = 2
};
}
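A sketch of building a form section with these options; the alternating option/value argument list and the parameterless CurlHttpMultiPartForm constructor are assumptions inferred from the documentation above:

using System;
using CurlSharp;
using CurlSharp.Enums;

internal static class MultiPartFormSketch
{
    internal static CurlHttpMultiPartForm BuildForm()
    {
        var form = new CurlHttpMultiPartForm();

        // A plain name/value field; every section must be terminated with End.
        var rc = form.AddSection(CurlFormOption.CopyName, "comment",
                                 CurlFormOption.CopyContents, "hello",
                                 CurlFormOption.End);
        if (rc != CurlFormCode.Ok)
            throw new InvalidOperationException($"AddSection failed: {rc}");

        return form;
    }
}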

View File

@@ -1,31 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// This enumeration contains values used to specify the FTP Ssl
/// authorization level using the
/// <see cref="CurlOption.FtpSslAuth" /> option when calling
/// <see cref="CurlEasy.SetOpt" />
/// </summary>
public enum CurlFtpAuth
{
/// <summary>
/// Let <c>libcurl</c> decide on the authorization scheme.
/// </summary>
Default = 0,
/// <summary>
/// Use "AUTH Ssl".
/// </summary>
SSL = 1,
/// <summary>
/// Use "AUTH TLS".
/// </summary>
TLS = 2,
/// <summary>
/// End-of-enumeration marker. Do not use in a client application.
/// </summary>
Last = 3
};
}

View File

@@ -1,37 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// This enumeration contains values used to specify the FTP Ssl level
/// using the <see cref="CurlOption.FtpSsl" /> option when calling
/// <see cref="CurlEasy.SetOpt" />
/// </summary>
public enum CurlFtpSsl
{
/// <summary>
/// Don't attempt to use Ssl.
/// </summary>
None = 0,
/// <summary>
/// Try using Ssl, proceed as normal otherwise.
/// </summary>
Try = 1,
/// <summary>
/// Require Ssl for the control connection or fail with
/// <see cref="CurlCode.FtpSslFailed" />.
/// </summary>
Control = 2,
/// <summary>
/// Require Ssl for all communication or fail with
/// <see cref="CurlCode.FtpSslFailed" />.
/// </summary>
All = 3,
/// <summary>
/// End-of-enumeration marker. Do not use in a client application.
/// </summary>
Last = 4
};
}
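A combined sketch for this option and the CurlFtpAuth level from the previous file; the member names come from the enums above, while the SetOpt overloads are assumed:

using CurlSharp;
using CurlSharp.Enums;

internal static class FtpSslSketch
{
    internal static void RequireTlsControlChannel(CurlEasy easy)
    {
        // Require SSL/TLS on the control connection, or fail with FtpSslFailed.
        easy.SetOpt(CurlOption.FtpSsl, CurlFtpSsl.Control);

        // Prefer "AUTH TLS" when negotiating.
        easy.SetOpt(CurlOption.FtpSslAuth, CurlFtpAuth.TLS);
    }
}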

View File

@@ -1,65 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// This enumeration contains values used to specify the HTTP authentication
/// when using the <see cref="CurlOption.HttpAuth" /> option when
/// calling <see cref="CurlEasy.SetOpt" />
/// </summary>
public enum CurlHttpAuth
{
/// <summary>
/// No authentication.
/// </summary>
None = 0,
/// <summary>
/// HTTP Basic authentication. This is the default choice, and the
/// only method that is in wide-spread use and supported virtually
/// everywhere. This is sending the user name and password over the
/// network in plain text, easily captured by others.
/// </summary>
Basic = 1,
/// <summary>
/// HTTP Digest authentication. Digest authentication is defined
/// in RFC2617 and is a more secure way to do authentication over
/// public networks than the regular old-fashioned Basic method.
/// </summary>
Digest = 2,
/// <summary>
/// HTTP GSS-Negotiate authentication. The GSS-Negotiate (also known
/// as plain "Negotiate") method was designed by Microsoft and is
/// used in their web applications. It is primarily meant as a
/// support for Kerberos5 authentication but may also be used along
/// with other authentication methods. For more information see IETF
/// draft draft-brezak-spnego-http-04.txt.
/// <note>
/// You need to use a version of libcurl.NET built with a suitable
/// GSS-API library for this to work. This is not currently standard.
/// </note>
/// </summary>
GssNegotiate = 4,
/// <summary>
/// HTTP Ntlm authentication. A proprietary protocol invented and
/// used by Microsoft. It uses a challenge-response and hash concept
/// similar to Digest, to prevent the password from being eavesdropped.
/// </summary>
Ntlm = 8,
/// <summary>
/// This is a convenience macro that sets all bits and thus makes
/// libcurl pick any it finds suitable. libcurl will automatically
/// select the one it finds most secure.
/// </summary>
Any = 15, // ~0
/// <summary>
/// This is a convenience macro that sets all bits except Basic
/// and thus makes libcurl pick any it finds suitable. libcurl
/// will automatically select the one it finds most secure.
/// </summary>
AnySafe = 14 // ~Basic
};
}
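A short sketch that lets libcurl negotiate the strongest non-Basic scheme; CurlOption.UserPwd is an assumed name for the credentials option:

using CurlSharp;
using CurlSharp.Enums;

internal static class HttpAuthSketch
{
    internal static void UseSafeAuth(CurlEasy easy, string user, string password)
    {
        // Anything except Basic; libcurl picks the most secure scheme the server offers.
        easy.SetOpt(CurlOption.HttpAuth, CurlHttpAuth.AnySafe);
        easy.SetOpt(CurlOption.UserPwd, $"{user}:{password}"); // assumed option name
    }
}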

View File

@@ -1,46 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Contains values used to specify the HTTP version level when using
/// the <see cref="CurlOption.HttpVersion" /> option in a call
/// to <see cref="CurlEasy.SetOpt" />
/// </summary>
public enum CurlHttpVersion
{
/// <summary>
/// We don't care about what version the library uses. libcurl will
/// use whatever it thinks fit.
/// </summary>
None = 0,
/// <summary>
/// Enforce HTTP 1.0 requests.
/// </summary>
Http1_0 = 1,
/// <summary>
/// Enforce HTTP 1.1 requests.
/// </summary>
Http1_1 = 2,
/// <summary>
/// Enforce HTTP 2 requests.
/// </summary>
Http2_0 = 3,
/// <summary>
/// Enforce version 2 for HTTPS, version 1.1 for HTTP.
/// </summary>
Http2_Tls = 4,
/// <summary>
/// Enforce HTTP 2 without HTTP/1.1 upgrade.
/// </summary>
Http2_PriorKnowledge = 5,
/// <summary>
/// Last entry in enumeration; do not use in application code.
/// </summary>
Last = 6
}
}

View File

@@ -1,222 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// This enumeration is used to extract information associated with an
/// <see cref="CurlEasy" /> transfer. Specifically, a member of this
/// enumeration is passed as the first argument to
/// CurlEasy.GetInfo specifying the item to retrieve in the
/// second argument, which is a reference to an <c>int</c>, a
/// <c>double</c>, a <c>string</c>, a <c>DateTime</c> or an <c>object</c>.
/// </summary>
public enum CurlInfo
{
/// <summary>
/// The second argument receives the elapsed time, as a <c>double</c>,
/// in seconds, from the start until the connect to the remote host
/// (or proxy) was completed.
/// </summary>
ConnectTime = 0x300005,
/// <summary>
/// The second argument receives, as a <c>double</c>, the content-length
/// of the download. This is the value read from the Content-Length: field.
/// </summary>
ContentLengthDownload = 0x30000F,
/// <summary>
/// The second argument receives, as a <c>double</c>, the specified size
/// of the upload.
/// </summary>
ContentLengthUpload = 0x300010,
/// <summary>
/// The second argument receives, as a <c>string</c>, the content-type of
/// the downloaded object. This is the value read from the Content-Type:
/// field. If you get <c>null</c>, it means that the server didn't
/// send a valid Content-Type header or that the protocol used
/// doesn't support this.
/// </summary>
ContentType = 0x100012,
/// <summary>
/// The second argument receives, as a <c>string</c>, the last
/// used effective URL.
/// </summary>
EffectiveUrl = 0x100001,
/// <summary>
/// The second argument receives, as a <c>long</c>, the remote time
/// of the retrieved document. You should construct a <c>DateTime</c>
/// from this value, as shown in the <c>InfoDemo</c> sample. If you
/// get a date in the distant
/// past, it can be because of many reasons (unknown, the server
/// hides it or the server doesn't support the command that tells
/// document time etc) and the time of the document is unknown. Note
/// that you must tell the server to collect this information before
/// the transfer is made, by using the
/// <see cref="CurlOption.Filetime" /> option to
/// <see cref="CurlEasy.SetOpt" />. (Added in 7.5)
/// </summary>
Filetime = 0x20000E,
/// <summary>
/// The second argument receives an <c>int</c> specifying the total size
/// of all the headers received.
/// </summary>
HeaderSize = 0x20000B,
/// <summary>
/// The second argument receives, as an <c>int</c>, a bitmask indicating
/// the authentication method(s) available. The meaning of the bits is
/// explained in the documentation of
/// <see cref="CurlOption.HttpAuth" />. (Added in 7.10.8)
/// </summary>
HttpAuthAvail = 0x200017,
/// <summary>
/// The second argument receives an <c>int</c> indicating the numeric
/// connect code for the HTTP request.
/// </summary>
HttpConnectCode = 0x200016,
/// <summary>
/// End-of-enumeration marker; do not use in client applications.
/// </summary>
LastOne = 0x1C,
/// <summary>
/// The second argument receives, as a <c>double</c>, the time, in
/// seconds it took from the start until the name resolving was
/// completed.
/// </summary>
NameLookupTime = 0x300004,
/// <summary>
/// Never used.
/// </summary>
None = 0x0,
/// <summary>
/// The second argument receives an <c>int</c> indicating the
/// number of current connections. (Added in 7.13.0)
/// </summary>
NumConnects = 0x20001A,
/// <summary>
/// The second argument receives an <c>int</c> indicating the operating
/// system error number: <c>_errno</c> or <c>GetLastError()</c>,
/// depending on the platform. (Added in 7.12.2)
/// </summary>
OsErrno = 0x200019,
/// <summary>
/// The second argument receives, as a <c>double</c>, the time, in
/// seconds, it took from the start until the file transfer is just about
/// to begin. This includes all pre-transfer commands and negotiations
/// that are specific to the particular protocol(s) involved.
/// </summary>
PreTransferTime = 0x300006,
/// <summary>
/// The second argument receives a reference to the private data
/// associated with the <see cref="CurlEasy" /> object (set with the
/// <see cref="CurlOption.Private" /> option to
/// <see cref="CurlEasy.SetOpt" />). (Added in 7.10.3)
/// </summary>
Private = 0x100015,
/// <summary>
/// The second argument receives, as an <c>int</c>, a bitmask
/// indicating the authentication method(s) available for your
/// proxy authentication. This will be a bitmask of
/// <see cref="CurlHttpAuth" /> enumeration constants.
/// (Added in 7.10.8)
/// </summary>
ProxyAuthAvail = 0x200018,
/// <summary>
/// The second argument receives an <c>int</c> indicating the total
/// number of redirections that were actually followed. (Added in 7.9.7)
/// </summary>
RedirectCount = 0x200014,
/// <summary>
/// The second argument receives, as a <c>double</c>, the total time, in
/// seconds, for all redirection steps, including name lookup, connect,
/// pretransfer and transfer, before the final transaction was started.
/// <c>RedirectTime</c> contains the complete execution
/// time for multiple redirections. (Added in 7.9.7)
/// </summary>
RedirectTime = 0x300013,
/// <summary>
/// The second argument receives an <c>int</c> containing the total size
/// of the issued requests. This is so far only for HTTP requests. Note
/// that this may be more than one request if
/// <see cref="CurlOption.FollowLocation" /> is <c>true</c>.
/// </summary>
RequestSize = 0x20000C,
/// <summary>
/// The second argument receives an <c>int</c> with the last received HTTP
/// or FTP code. This option was known as <c>CURLINFO_HTTP_CODE</c> in
/// libcurl 7.10.7 and earlier.
/// </summary>
ResponseCode = 0x200002,
/// <summary>
/// The second argument receives a <c>double</c> with the total amount of
/// bytes that were downloaded. The amount is only for the latest transfer
/// and will be reset again for each new transfer.
/// </summary>
SizeDownload = 0x300008,
/// <summary>
/// The second argument receives a <c>double</c> with the total amount
/// of bytes that were uploaded.
/// </summary>
SizeUpload = 0x300007,
/// <summary>
/// The second argument receives a <c>double</c> with the average
/// download speed that cURL measured for the complete download.
/// </summary>
SpeedDownload = 0x300009,
/// <summary>
/// The second argument receives a <c>double</c> with the average
/// upload speed that libcurl measured for the complete upload.
/// </summary>
SpeedUpload = 0x30000A,
/// <summary>
/// The second argument receives an <see cref="CurlSlist" /> containing
/// the names of the available Ssl engines.
/// </summary>
SslEngines = 0x40001B,
/// <summary>
/// The second argument receives an <c>int</c> with the result of
/// the certificate verification that was requested (using the
/// <see cref="CurlOption.SslVerifyPeer" /> option in
/// <see cref="CurlEasy.SetOpt" />).
/// </summary>
SslVerifyResult = 0x20000D,
/// <summary>
/// The second argument receives a <c>double</c> specifying the time,
/// in seconds, from the start until the first byte is just about to be
/// transferred. This includes <c>PreTransferTime</c> and
/// also the time the server needs to calculate the result.
/// </summary>
StartTransferTime = 0x300011,
/// <summary>
/// The second argument receives a <c>double</c> indicating the total transaction
/// time in seconds for the previous transfer. This time does not include
/// the connect time, so if you want the complete operation time,
/// you should add the <c>ConnectTime</c>.
/// </summary>
TotalTime = 0x300003,
};
}
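A hedged example of reading transfer information after a completed request; the by-reference GetInfo overloads are inferred from the summary above and should be checked against the actual CurlEasy API:

using System;
using CurlSharp;
using CurlSharp.Enums;

internal static class GetInfoSketch
{
    internal static void PrintStats(CurlEasy easy)
    {
        var responseCode = 0;
        var totalTime = 0.0;

        // Each CurlInfo member maps to a typed by-reference argument, per the summary above.
        easy.GetInfo(CurlInfo.ResponseCode, ref responseCode);
        easy.GetInfo(CurlInfo.TotalTime, ref totalTime);

        Console.WriteLine($"HTTP {responseCode} in {totalTime:F3}s");
    }
}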

View File

@@ -1,50 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// A member of this enumeration is passed as the first parameter to the
/// <see cref="CurlEasy.CurlDebugCallback" /> delegate to which libcurl passes
/// debug messages.
/// </summary>
public enum CurlInfoType
{
/// <summary>
/// The data is informational text.
/// </summary>
Text = 0,
/// <summary>
/// The data is header (or header-like) data received from the peer.
/// </summary>
HeaderIn = 1,
/// <summary>
/// The data is header (or header-like) data sent to the peer.
/// </summary>
HeaderOut = 2,
/// <summary>
/// The data is protocol data received from the peer.
/// </summary>
DataIn = 3,
/// <summary>
/// The data is protocol data sent to the peer.
/// </summary>
DataOut = 4,
/// <summary>
/// The data is Ssl-related data sent to the peer.
/// </summary>
SslDataIn = 5,
/// <summary>
/// The data is Ssl-related data received from the peer.
/// </summary>
SslDataOut = 6,
/// <summary>
/// End of enumeration marker, don't use in a client application.
/// </summary>
End = 7
};
}

View File

@@ -1,34 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Contains values used to initialize libcurl internally. One of
/// these is passed in the call to <see cref="Curl.GlobalInit" />.
/// </summary>
public enum CurlInitFlag
{
/// <summary>
/// Initialise nothing extra. This sets no bit.
/// </summary>
Nothing = 0,
/// <summary>
/// Initialize Ssl.
/// </summary>
Ssl = 1,
/// <summary>
/// Initialize the Win32 socket libraries.
/// </summary>
Win32 = 2,
/// <summary>
/// Initialize everything possible. This sets all known bits.
/// </summary>
All = 3,
/// <summary>
/// Equivalent to <c>All</c>.
/// </summary>
Default = All
};
}

View File

@@ -1,27 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Your handler for the <see cref="CurlEasy.CurlIoctlCallback" />
/// delegate is passed one of these values as its first parameter.
/// Right now, the only supported value is
/// <code>RestartRead</code>.
/// </summary>
public enum CurlIoCommand
{
/// <summary>
/// No IOCTL operation; we should never see this.
/// </summary>
Nop = 0,
/// <summary>
/// When this is sent, your callback may need to, for example,
/// rewind a local file that is being sent via FTP.
/// </summary>
RestartRead = 1,
/// <summary>
/// End of enumeration marker, don't use in a client application.
/// </summary>
Last = 2
}
}

View File

@@ -1,30 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Your handler for the <see cref="CurlEasy.CurlIoctlCallback" /> delegate
/// should return a member of this enumeration.
/// </summary>
public enum CurlIoError
{
/// <summary>
/// Indicate that the callback processed everything okay.
/// </summary>
Ok = 0,
/// <summary>
/// Unknown command sent to callback. Right now, only
/// <code>RestartRead</code> is supported.
/// </summary>
UnknownCommand = 1,
/// <summary>
/// Indicate to libcurl that a restart failed.
/// </summary>
FailRestart = 2,
/// <summary>
/// End of enumeration marker, don't use in a client application.
/// </summary>
Last = 3
}
}

View File

@@ -1,26 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// This enumeration contains values used to specify the IP resolution
/// method when using the <see cref="CurlOption.IpResolve" />
/// option in a call to <see cref="CurlEasy.SetOpt" />
/// </summary>
public enum CurlIpResolve
{
/// <summary>
/// Default, resolves addresses to all IP versions that your system
/// allows.
/// </summary>
Whatever = 0,
/// <summary>
/// Resolve to ipv4 addresses.
/// </summary>
V4 = 1,
/// <summary>
/// Resolve to ipv6 addresses.
/// </summary>
V6 = 2
};
}

View File

@@ -1,31 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Values containing the type of shared access requested when libcurl
/// calls the <see cref="CurlShare.CurlShareLockCallback" /> delegate.
/// </summary>
public enum CurlLockAccess
{
/// <summary>
/// Unspecified action; the delegate should never receive this.
/// </summary>
None = 0,
/// <summary>
/// The delegate receives this call when libcurl is requesting
/// read access to the shared resource.
/// </summary>
Shared = 1,
/// <summary>
/// The delegate receives this call when libcurl is requesting
/// write access to the shared resource.
/// </summary>
Single = 2,
/// <summary>
/// End-of-enumeration marker; do not use in application code.
/// </summary>
Last = 3
};
}

View File

@@ -1,48 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Members of this enumeration should be passed to
/// <see cref="CurlShare.SetOpt" /> when it is called with the
/// <c>CurlShare</c> or <c>Unshare</c> options
/// provided in the <see cref="CurlShareOption" /> enumeration.
/// </summary>
public enum CurlLockData
{
/// <summary>
/// Not used.
/// </summary>
None = 0,
/// <summary>
/// Used internally by libcurl.
/// </summary>
Share = 1,
/// <summary>
/// Cookie data will be shared across the <see cref="CurlEasy" /> objects
/// using this shared object.
/// </summary>
Cookie = 2,
/// <summary>
/// Cached Dns hosts will be shared across the <see cref="CurlEasy" />
/// objects using this shared object.
/// </summary>
Dns = 3,
/// <summary>
/// Not supported yet.
/// </summary>
SslSession = 4,
/// <summary>
/// Not supported yet.
/// </summary>
Connect = 5,
/// <summary>
/// End-of-enumeration marker; do not use in application code.
/// </summary>
Last = 6
};
}

View File

@@ -1,25 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// The status code associated with an <see cref="CurlEasy" /> object in a
/// <see cref="CurlMulti" /> operation. One of these is returned in response
/// to reading the <see cref="CurlMultiInfo.Msg" /> property.
/// </summary>
public enum CurlMessage
{
/// <summary>
/// First entry in the enumeration, not used.
/// </summary>
None = 0,
/// <summary>
/// The associated <see cref="CurlEasy" /> object completed.
/// </summary>
Done = 1,
/// <summary>
/// End-of-enumeration marker, not used.
/// </summary>
Last = 2
};
}

View File

@@ -1,46 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Contains return codes for many of the functions in the
/// <see cref="CurlMulti" /> class.
/// </summary>
public enum CurlMultiCode
{
/// <summary>
/// You should call <see cref="CurlMulti.Perform" /> again before calling
/// <see cref="CurlMulti.Select" />.
/// </summary>
CallMultiPerform = -1,
/// <summary>
/// The function succeeded.
/// </summary>
Ok = 0,
/// <summary>
/// The internal <see cref="CurlMulti" /> is bad.
/// </summary>
BadHandle = 1,
/// <summary>
/// One of the <see cref="CurlEasy" /> handles associated with the
/// <see cref="CurlMulti" /> object is bad.
/// </summary>
BadEasyHandle = 2,
/// <summary>
/// Out of memory. This is a severe problem.
/// </summary>
OutOfMemory = 3,
/// <summary>
/// Internal error deep within the libcurl library.
/// </summary>
InternalError = 4,
/// <summary>
/// End-of-enumeration marker, not used.
/// </summary>
Last = 5
};
}
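CallMultiPerform drives the classic multi loop: keep calling Perform while libcurl says there is immediately more work. A hedged sketch, with the Perform(ref int) signature assumed:

using CurlSharp;
using CurlSharp.Enums;

internal static class MultiLoopSketch
{
    internal static void Pump(CurlMulti multi)
    {
        var running = 0;
        CurlMultiCode rc;

        // Repeat while libcurl asks to be called again straight away.
        do
        {
            rc = multi.Perform(ref running);
        } while (rc == CurlMultiCode.CallMultiPerform);
    }
}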

View File

@@ -1,46 +0,0 @@
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
//
// Copyright (c) 2017, Dr. Masroor Ehsan. All rights reserved.
//
// $Id:$
//
// Last modified: 25.01.2017 1:29 AM
//
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
namespace CurlSharp.Enums
{
public enum CurlMultiOption
{
/* This is the socket callback function pointer */
SocketFunction = CurlOptType.FunctionPoint + 1,
/* This is the argument passed to the socket callback */
SocketData = CurlOptType.ObjectPoint + 2,
/* set to 1 to enable pipelining for this multi handle */
Pipelining = CurlOptType.Long + 3,
/* This is the timer callback function pointer */
TimerFunction = CurlOptType.FunctionPoint + 4,
/* This is the argument passed to the timer callback */
TimerDate = CurlOptType.ObjectPoint + 5,
/* maximum number of entries in the connection cache */
MaxConnects = CurlOptType.Long + 6,
/* maximum number of (pipelining) connections to one host */
MaxHostConnections = CurlOptType.Long + 7,
/* maximum number of requests in a pipeline */
MaxPipelineLength = CurlOptType.Long + 8,
/* a connection with a content-length longer than this will not be considered for pipelining */
ContentLengthPenaltySize = CurlOptType.Offset + 9,
/* a connection with a chunk length longer than this will not be considered for pipelining */
ChunkLengthPenaltySize = CurlOptType.Offset + 10,
/* a list of site names(+port) that are blacklisted from pipelining */
PipeliningSiteBlackList = CurlOptType.ObjectPoint + 11,
/* a list of server types that are blacklisted from pipelining */
PipeliningServerBlackList = CurlOptType.ObjectPoint + 12,
/* maximum number of open connections in total */
MaxTotalConnections = CurlOptType.Long + 13,
/* This is the server push callback function pointer */
PushFunction = CurlOptType.FunctionPoint + 14,
/* This is the argument passed to the server push callback */
PushData = CurlOptType.ObjectPoint + 15
}
}

View File

@@ -1,43 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Contains values used to specify the preference of libcurl between
/// using user names and passwords from your ~/.netrc file, relative to
/// user names and passwords in the URL supplied with
/// <see cref="CurlOption.Url" />. This is passed when using
/// the <see cref="CurlOption.Netrc" /> option in a call
/// to <see cref="CurlEasy.SetOpt" />
/// </summary>
public enum CurlNetrcOption
{
/// <summary>
/// The library will ignore the file and use only the information
/// in the URL. This is the default.
/// </summary>
Ignored = 0,
/// <summary>
/// The use of your ~/.netrc file is optional, and information in the
/// URL is to be preferred. The file will be scanned with the host
/// and user name (to find the password only) or with the host only,
/// to find the first user name and password after that machine,
/// whichever information is not specified in the URL.
/// <para>
/// Undefined values of the option will have this effect.
/// </para>
/// </summary>
Optional = 1,
/// <summary>
/// This value tells the library that use of the file is required,
/// to ignore the information in the URL, and to search the file
/// with the host only.
/// </summary>
Required = 2,
/// <summary>
/// Last entry in enumeration; do not use in application code.
/// </summary>
Last = 3
};
}

View File

@@ -1,21 +0,0 @@
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
//
// Copyright (c) 2017, Dr. Masroor Ehsan. All rights reserved.
//
// $Id:$
//
// Last modified: 25.01.2017 1:31 AM
//
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
namespace CurlSharp.Enums
{
public enum CurlOptType
{
Long = 0,
ObjectPoint = 10000,
StringPoint = 10000,
FunctionPoint = 20000,
Offset = 30000
}
}

File diff suppressed because it is too large

View File

@@ -1,21 +0,0 @@
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
//
// Copyright (c) 2017, Dr. Masroor Ehsan. All rights reserved.
//
// $Id:$
//
// Last modified: 25.01.2017 1:23 AM
//
// * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
namespace CurlSharp.Enums
{
/* bitmask bits for CURLMOPT_PIPELINING */
public enum CurlPipelining : long
{
Nothing = 0,
Http1 = 1,
Multiplex = 2
}
}

View File

@@ -1,27 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// This enumeration contains values used to specify the proxy type when
/// using the <see cref="CurlOption.Proxy" /> option when calling
/// <see cref="CurlEasy.SetOpt" />
/// </summary>
public enum CurlProxyType
{
/// <summary>
/// Ordinary HTTP proxy.
/// </summary>
Http = 0,
/// <summary>
/// Use if the proxy supports SOCKS4 user authentication. If you're
/// unfamiliar with this, consult your network administrator.
/// </summary>
Socks4 = 4,
/// <summary>
/// Use if the proxy supports SOCKS5 user authentication. If you're
/// unfamiliar with this, consult your network administrator.
/// </summary>
Socks5 = 5
};
}

View File

@@ -1,40 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Contains return codes from many of the functions in the
/// <see cref="CurlShare" /> class.
/// </summary>
public enum CurlShareCode
{
/// <summary>
/// The function succeeded.
/// </summary>
Ok = 0,
/// <summary>
/// A bad option was passed to <see cref="CurlShare.SetOpt" />.
/// </summary>
BadOption = 1,
/// <summary>
/// An attempt was made to pass an option to
/// <see cref="CurlShare.SetOpt" /> while the CurlShare object is in use.
/// </summary>
InUse = 2,
/// <summary>
/// The <see cref="CurlShare" /> object's internal handle is invalid.
/// </summary>
Invalid = 3,
/// <summary>
/// Out of memory. This is a severe problem.
/// </summary>
NoMem = 4,
/// <summary>
/// End-of-enumeration marker; do not use in application code.
/// </summary>
Last = 5
};
}

View File

@@ -1,53 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// A member of this enumeration is passed to the function
/// <see cref="CurlShare.SetOpt" /> to configure a <see cref="CurlShare" />
/// transfer.
/// </summary>
public enum CurlShareOption
{
/// <summary>
/// Start-of-enumeration; do not use in application code.
/// </summary>
None = 0,
/// <summary>
/// The parameter, which should be a member of the
/// <see cref="CurlLockData" /> enumeration, specifies a type of
/// data that should be shared.
/// </summary>
Share = 1,
/// <summary>
/// The parameter, which should be a member of the
/// <see cref="CurlLockData" /> enumeration, specifies a type of
/// data that should be unshared.
/// </summary>
Unshare = 2,
/// <summary>
/// The parameter should be a reference to a
/// <see cref="CurlShare.CurlShareLockCallback" /> delegate.
/// </summary>
LockFunction = 3,
/// <summary>
/// The parameter should be a reference to a
/// <see cref="CurlShare.CurlShareUnlockCallback" /> delegate.
/// </summary>
UnlockFunction = 4,
/// <summary>
/// The parameter allows you to specify an object reference that
/// will be passed to the <see cref="CurlShare.CurlShareLockCallback" /> delegate and
/// the <see cref="CurlShare.CurlShareUnlockCallback" /> delegate.
/// </summary>
UserData = 5,
/// <summary>
/// End-of-enumeration; do not use in application code.
/// </summary>
Last = 6
};
}
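A sketch of configuring a CurlShare with these options so that DNS and cookie data are shared between easy handles; the SetOpt overload taking a CurlLockData value is assumed from the documentation above:

using CurlSharp;
using CurlSharp.Enums;

internal static class ShareSetupSketch
{
    internal static CurlShare CreateShared()
    {
        var share = new CurlShare();

        // Share resolved host names and cookies across easy handles.
        share.SetOpt(CurlShareOption.Share, CurlLockData.Dns);
        share.SetOpt(CurlShareOption.Share, CurlLockData.Cookie);

        // For multi-threaded use, LockFunction/UnlockFunction callbacks and
        // UserData would also be registered here (omitted in this sketch).
        return share;
    }
}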

View File

@@ -1,36 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Contains values used to specify the Ssl version level when using
/// the <see cref="CurlOption.SslVersion" /> option in a call
/// to <see cref="CurlEasy.SetOpt" />
/// </summary>
public enum CurlSslVersion
{
/// <summary>
/// Use whatever version the Ssl library selects.
/// </summary>
Default = 0,
/// <summary>
/// Use TLS version 1.
/// </summary>
Tlsv1 = 1,
/// <summary>
/// Use Ssl version 2. This is not a good option unless it's the
/// only version supported by the remote server.
/// </summary>
Sslv2 = 2,
/// <summary>
/// Use Ssl version 3. This is a preferred option.
/// </summary>
Sslv3 = 3,
/// <summary>
/// Last entry in enumeration; do not use in application code.
/// </summary>
Last = 4
};
}

View File

@@ -1,39 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// Contains values used to specify the time condition when using
/// the <see cref="CurlOption.TimeCondition" /> option in a call
/// to <see cref="CurlEasy.SetOpt" />
/// </summary>
public enum CurlTimeCond
{
/// <summary>
/// Use no time condition.
/// </summary>
None = 0,
/// <summary>
/// The time condition is true if the resource has been modified
/// since the date/time passed in
/// <see cref="CurlOption.TimeValue" />.
/// </summary>
IfModSince = 1,
/// <summary>
/// True if the resource has not been modified since the date/time
/// passed in <see cref="CurlOption.TimeValue" />.
/// </summary>
IfUnmodSince = 2,
/// <summary>
/// True if the resource's last modification date/time equals that
/// passed in <see cref="CurlOption.TimeValue" />.
/// </summary>
LastMod = 3,
/// <summary>
/// Last entry in enumeration; do not use in application code.
/// </summary>
Last = 4
};
}
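A sketch of a conditional (If-Modified-Since style) request built from this enum and CurlOption.TimeValue; encoding TimeValue as seconds since the Unix epoch is an assumption to verify against the option documentation:

using System;
using CurlSharp;
using CurlSharp.Enums;

internal static class TimeConditionSketch
{
    internal static void OnlyIfModifiedSince(CurlEasy easy, DateTime sinceUtc)
    {
        // Seconds since the Unix epoch (assumed representation for TimeValue).
        var epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
        var seconds = (int) (sinceUtc - epoch).TotalSeconds;

        easy.SetOpt(CurlOption.TimeCondition, CurlTimeCond.IfModSince);
        easy.SetOpt(CurlOption.TimeValue, seconds);
    }
}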

View File

@@ -1,34 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// A member of this enumeration is passed to the function
/// <see cref="Curl.GetVersionInfo" />
/// </summary>
public enum CurlVersion
{
/// <summary>
/// Capabilities associated with the initial version of libcurl.
/// </summary>
First = 0,
/// <summary>
/// Capabilities associated with the second version of libcurl.
/// </summary>
Second = 1,
/// <summary>
/// Capabilities associated with the third version of libcurl.
/// </summary>
Third = 2,
/// <summary>
/// Same as <c>Third</c>.
/// </summary>
Now = Third,
/// <summary>
/// End-of-enumeration marker; do not use in application code.
/// </summary>
Last = 3
};
}

View File

@@ -1,71 +0,0 @@
namespace CurlSharp.Enums
{
/// <summary>
/// A bitmask of libcurl features OR'd together as the value of the
/// property <see cref="CurlVersionInfoData.Features" />. The feature
/// bits are summarized in the table below.
/// </summary>
public enum CurlVersionFeatureBitmask
{
/// <summary>
/// Supports Ipv6.
/// </summary>
Ipv6 = 0x01,
/// <summary>
/// Supports kerberos4 (when using FTP).
/// </summary>
Kerberos64 = 0x02,
/// <summary>
/// Supports Ssl (HTTPS/FTPS).
/// </summary>
Ssl = 0x04,
/// <summary>
/// Supports HTTP deflate using libz.
/// </summary>
LibZ = 0x08,
/// <summary>
/// Supports HTTP Ntlm (added in 7.10.6).
/// </summary>
Ntlm = 0x10,
/// <summary>
/// Supports HTTP GSS-Negotiate (added in 7.10.6).
/// </summary>
GssNegotiate = 0x20,
/// <summary>
/// libcurl was built with extra debug capabilities built-in. This
/// is mainly of interest for libcurl hackers. (added in 7.10.6)
/// </summary>
Debug = 0x40,
/// <summary>
/// libcurl was built with support for asynchronous name lookups,
/// which allows more exact timeouts (even on Windows) and less
/// blocking when using the multi interface. (added in 7.10.7)
/// </summary>
AsynchDns = 0x80,
/// <summary>
/// libcurl was built with support for Spnego authentication
/// (Simple and Protected GSS-API Negotiation Mechanism, defined
/// in RFC 2478.) (added in 7.10.8)
/// </summary>
Spnego = 0x100,
/// <summary>
/// libcurl was built with support for large files.
/// </summary>
LargeFile = 0x200,
/// <summary>
/// libcurl was built with support for IDNA, domain names with
/// international letters.
/// </summary>
Idn = 0x400
};
}
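Since Features is an OR'd bitmask exposed as an int on CurlVersionInfoData (only under the shim build in this source), individual capabilities are tested with a bitwise AND; a minimal sketch:

using CurlSharp;
using CurlSharp.Enums;

internal static class FeatureCheckSketch
{
#if USE_LIBCURLSHIM
    // True when the linked libcurl was built with SSL support.
    internal static bool SupportsSsl(CurlVersionInfoData info)
        => (info.Features & (int) CurlVersionFeatureBitmask.Ssl) != 0;
#endif
}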

View File

@@ -1,29 +0,0 @@
Copyright (c) 2013, Masroor Ehsan Choudhury
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:
* Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
* Neither the name of the {organization} nor the names of its
contributors may be used to endorse or promote products derived from
this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR
ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON
ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
https://github.com/masroore/CurlSharp

@@ -1,619 +0,0 @@
/***************************************************************************
*
* CurlS#arp
*
* Copyright (c) 2013-2017 Dr. Masroor Ehsan (masroore@gmail.com)
* Portions copyright (c) 2004, 2005 Jeff Phillips (jeff@jeffp.net)
* Portions copyright (c) 2017 Katelyn Gigante (https://github.com/silasary)
*
* This software is licensed as described in the file LICENSE, which you
* should have received as part of this distribution.
*
* You may opt to use, copy, modify, merge, publish, distribute and/or sell
* copies of this Software, and permit persons to whom the Software is
* furnished to do so, under the terms of the LICENSE file.
*
* This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF
* ANY KIND, either express or implied.
*
**************************************************************************/
//#define USE_LIBCURLSHIM
using System;
using System.IO;
using System.Reflection;
using System.Runtime.InteropServices;
using CurlSharp.Enums;
namespace CurlSharp
{
/// <summary>
/// P/Invoke signatures.
/// </summary>
internal static unsafe class NativeMethods
{
private const string LIBCURL = "libcurl";
private const string LIBCURLSHIM = "libcurlshim";
private const string LIBC_LINUX = "libc";
private const string WINSOCK_LIB = "ws2_32.dll";
private const string LIB_DIR_WIN64 = "amd64";
private const string LIB_DIR_WIN32 = "i386";
static NativeMethods()
{
if (Environment.OSVersion.Platform == PlatformID.Win32NT)
{
if (Environment.Is64BitOperatingSystem)
{
SetDllDirectory(Path.Combine(AssemblyDirectory, LIB_DIR_WIN64));
}
else
{
SetDllDirectory(Path.Combine(AssemblyDirectory, LIB_DIR_WIN32));
}
}
#if USE_LIBCURLSHIM
if (!RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
throw new InvalidOperationException("Cannot run on a platform other than Windows (.NET)");
#endif
}
[DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
[return: MarshalAs(UnmanagedType.Bool)]
private static extern bool SetDllDirectory(string lpPathName);
private static string AssemblyDirectory
{
get
{
var codeBase = typeof(NativeMethods).GetTypeInfo().Assembly.CodeBase;
var uri = new UriBuilder(codeBase);
var path = Uri.UnescapeDataString(uri.Path);
return Path.GetDirectoryName(path);
}
}
#region curl_global_init
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_global_init(int flags);
#endregion
#region curl_global_cleanup
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_global_cleanup();
#endregion
#region curl_easy_escape
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
public static extern IntPtr curl_easy_escape(IntPtr pEasy, string url, int length);
#endregion
#region curl_easy_unescape
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
public static extern IntPtr curl_easy_unescape(IntPtr pEasy, string url, int inLength, out int outLength);
#endregion
#region curl_free
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_free(IntPtr p);
#endregion
#region curl_version
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_version();
#endregion
#region curl_version_info
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_version_info(CurlVersion ver);
#endregion
#region curl_easy_init
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_easy_init();
#endregion
#region curl_easy_cleanup
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_easy_cleanup(IntPtr pCurl);
#endregion
#region curl_easy_setopt
#region Delegates
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
public delegate int _CurlGenericCallback(IntPtr ptr, int sz, int nmemb, IntPtr userdata);
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
public delegate int _CurlProgressCallback(
IntPtr extraData,
double dlTotal,
double dlNow,
double ulTotal,
double ulNow);
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
public delegate int _CurlDebugCallback(
IntPtr ptrCurl,
CurlInfoType infoType,
string message,
int size,
IntPtr ptrUserData);
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
public delegate int _CurlSslCtxCallback(IntPtr ctx, IntPtr parm);
[UnmanagedFunctionPointer(CallingConvention.Cdecl)]
public delegate CurlIoError _CurlIoctlCallback(CurlIoCommand cmd, IntPtr parm);
#endregion
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_setopt(IntPtr pCurl, CurlOption opt, IntPtr parm);
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_setopt(IntPtr pCurl, CurlOption opt, string parm);
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_setopt(IntPtr pCurl, CurlOption opt, byte[] parm);
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_setopt(IntPtr pCurl, CurlOption opt, long parm);
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_setopt(IntPtr pCurl, CurlOption opt, bool parm);
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_setopt(IntPtr pCurl, CurlOption opt, _CurlGenericCallback parm);
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_setopt(IntPtr pCurl, CurlOption opt, _CurlProgressCallback parm);
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_setopt(IntPtr pCurl, CurlOption opt, _CurlDebugCallback parm);
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_setopt(IntPtr pCurl, CurlOption opt, _CurlSslCtxCallback parm);
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_setopt(IntPtr pCurl, CurlOption opt, _CurlIoctlCallback parm);
#endregion
#region curl_easy_perform
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_perform(IntPtr pCurl);
#endregion
#region curl_easy_duphandle
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_easy_duphandle(IntPtr pCurl);
#endregion
#region curl_easy_strerror
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_easy_strerror(CurlCode err);
#endregion
#region curl_easy_getinfo
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_getinfo(IntPtr pCurl, CurlInfo info, ref IntPtr pInfo);
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlCode curl_easy_getinfo(IntPtr pCurl, CurlInfo info, ref double dblVal);
#endregion
#region curl_easy_reset
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_easy_reset(IntPtr pCurl);
#endregion
#region curl_multi_init
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_multi_init();
#endregion
#region curl_multi_cleanup
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlMultiCode curl_multi_cleanup(IntPtr pmulti);
#endregion
#region curl_multi_add_handle
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlMultiCode curl_multi_add_handle(IntPtr pmulti, IntPtr peasy);
#endregion
#region curl_multi_remove_handle
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlMultiCode curl_multi_remove_handle(IntPtr pmulti, IntPtr peasy);
#endregion
#region curl_multi_setopt
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlMultiCode curl_multi_setopt(IntPtr pmulti, CurlMultiOption opt, bool parm);
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlMultiCode curl_multi_setopt(IntPtr pmulti, CurlMultiOption opt, long parm);
#endregion
#region curl_multi_strerror
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_multi_strerror(CurlMultiCode errorNum);
#endregion
#region curl_multi_perform
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlMultiCode curl_multi_perform(IntPtr pmulti, ref int runningHandles);
#endregion
#if !USE_LIBCURLSHIM
#region curl_multi_fdset
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlMultiCode curl_multi_fdset(IntPtr pmulti,
[In] [Out] ref fd_set read_fd_set,
[In] [Out] ref fd_set write_fd_set,
[In] [Out] ref fd_set exc_fd_set,
[In] [Out] ref int max_fd);
[StructLayout(LayoutKind.Sequential)]
public struct fd_set
{
public uint fd_count;
// [MarshalAs(UnmanagedType.ByValArray, SizeConst = FD_SETSIZE)] public IntPtr[] fd_array;
public fixed uint fd_array[FD_SETSIZE];
public const int FD_SETSIZE = 64;
public void Cleanup()
{
// fd_array = null;
}
public static fd_set Create()
{
return new fd_set
{
// fd_array = new IntPtr[FD_SETSIZE],
fd_count = 0
};
}
public static fd_set Create(IntPtr socket)
{
var handle = Create();
handle.fd_count = 1;
handle.fd_array[0] = (uint)socket;
return handle;
}
}
public static void FD_ZERO(fd_set fds)
{
for (var i = 0; i < fd_set.FD_SETSIZE; i++)
{
fds.fd_array[i] = 0;
}
fds.fd_count = 0;
}
#endregion
#region select
[StructLayout(LayoutKind.Sequential)]
public struct timeval
{
/// <summary>
/// Time interval, in seconds.
/// </summary>
public int tv_sec;
/// <summary>
/// Time interval, in microseconds.
/// </summary>
public int tv_usec;
public static timeval Create(int milliseconds)
{
return new timeval
{
tv_sec = milliseconds / 1000,
tv_usec = milliseconds % 1000 * 1000
};
}
}
[DllImport(LIBC_LINUX, EntryPoint = "select")]
private static extern int select_unix(
int nfds, // number of sockets, (ignored in winsock)
[In] [Out] ref fd_set readfds, // read sockets to watch
[In] [Out] ref fd_set writefds, // write sockets to watch
[In] [Out] ref fd_set exceptfds, // error sockets to watch
ref timeval timeout);
[DllImport(WINSOCK_LIB, EntryPoint = "select")]
private static extern int select_win(
int nfds, // number of sockets, (ignored in winsock)
[In] [Out] ref fd_set readfds, // read sockets to watch
[In] [Out] ref fd_set writefds, // write sockets to watch
[In] [Out] ref fd_set exceptfds, // error sockets to watch
ref timeval timeout);
public static int select(
int nfds, // number of sockets, (ignored in winsock)
[In] [Out] ref fd_set readfds, // read sockets to watch
[In] [Out] ref fd_set writefds, // write sockets to watch
[In] [Out] ref fd_set exceptfds, // error sockets to watch
ref timeval timeout)
{
int result;
if (Environment.OSVersion.Platform == PlatformID.Win32NT)
{
result = select_win(
nfds, // number of sockets, (ignored in winsock)
ref readfds, // read sockets to watch
ref writefds, // write sockets to watch
ref exceptfds, // error sockets to watch
ref timeout);
}
else
{
result = select_unix(
nfds, // number of sockets, (ignored in winsock)
ref readfds, // read sockets to watch
ref writefds, // write sockets to watch
ref exceptfds, // error sockets to watch
ref timeout);
}
return result;
}
#endregion
#endif
#region curl_share_init
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_share_init();
#endregion
#region curl_share_cleanup
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlShareCode curl_share_cleanup(IntPtr pShare);
#endregion
#region curl_share_strerror
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_share_strerror(CurlShareCode errorCode);
#endregion
#region curl_share_setopt
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlShareCode curl_share_setopt(
IntPtr pShare,
CurlShareOption optCode,
IntPtr option);
#endregion
#region curl_formadd
#if !USE_LIBCURLSHIM
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern int curl_formadd(ref IntPtr pHttppost, ref IntPtr pLastPost,
int codeFirst, IntPtr bufFirst,
int codeNext, IntPtr bufNext,
int codeLast);
#endif
#endregion
#region curl_formfree
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_formfree(IntPtr pForm);
#endregion
#region curl_slist_append
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
public static extern IntPtr curl_slist_append(IntPtr slist, string data);
#endregion
#region curl_slist_free_all
[DllImport(LIBCURL, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_slist_free_all(IntPtr pList);
#endregion
#if USE_LIBCURLSHIM
#region libcurlshim imports
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_shim_initialize();
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_shim_cleanup();
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_shim_alloc_strings();
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
public static extern IntPtr curl_shim_add_string_to_slist(IntPtr pStrings, string str);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
public static extern IntPtr curl_shim_get_string_from_slist(IntPtr pSlist, ref IntPtr pStr);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl, CharSet = CharSet.Ansi)]
public static extern IntPtr curl_shim_add_string(IntPtr pStrings, string str);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_shim_free_strings(IntPtr pStrings);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern int curl_shim_install_delegates(
IntPtr pCurl,
IntPtr pThis,
_ShimWriteCallback pWrite,
_ShimReadCallback pRead,
_ShimProgressCallback pProgress,
_ShimDebugCallback pDebug,
_ShimHeaderCallback pHeader,
_ShimSslCtxCallback pCtx,
_ShimIoctlCallback pIoctl);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_shim_cleanup_delegates(IntPtr pThis);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_shim_get_file_time(
int unixTime,
ref int yy,
ref int mm,
ref int dd,
ref int hh,
ref int mn,
ref int ss);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_shim_free_slist(IntPtr p);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_shim_alloc_fd_sets();
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_shim_free_fd_sets(IntPtr fdsets);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern CurlMultiCode curl_shim_multi_fdset(IntPtr multi, IntPtr fdsets, ref int maxFD);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern int curl_shim_select(int maxFD, IntPtr fdsets, int milliseconds);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_shim_multi_info_read(IntPtr multi, ref int nMsgs);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_shim_multi_info_free(IntPtr multiInfo);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern int curl_shim_formadd(IntPtr[] ppForms, IntPtr[] pParams, int nParams);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern int curl_shim_install_share_delegates(
IntPtr pShare,
IntPtr pThis,
_ShimLockCallback pLock,
_ShimUnlockCallback pUnlock);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern void curl_shim_cleanup_share_delegates(IntPtr pShare);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern int curl_shim_get_version_int_value(IntPtr p, int offset);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_shim_get_version_char_ptr(IntPtr p, int offset);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern int curl_shim_get_number_of_protocols(IntPtr p, int offset);
[DllImport(LIBCURLSHIM, CallingConvention = CallingConvention.Cdecl)]
public static extern IntPtr curl_shim_get_protocol_string(IntPtr p, int offset, int index);
public delegate void _ShimLockCallback(int data, int access, IntPtr userPtr);
public delegate void _ShimUnlockCallback(int data, IntPtr userPtr);
public delegate int _ShimDebugCallback(CurlInfoType infoType, IntPtr msgBuf, int msgBufSize, IntPtr parm);
public delegate int _ShimHeaderCallback(IntPtr buf, int sz, int nmemb, IntPtr stream);
public delegate CurlIoError _ShimIoctlCallback(CurlIoCommand cmd, IntPtr parm);
public delegate int _ShimProgressCallback(
IntPtr parm,
double dlTotal,
double dlNow,
double ulTotal,
double ulNow);
public delegate int _ShimReadCallback(IntPtr buf, int sz, int nmemb, IntPtr parm);
public delegate int _ShimSslCtxCallback(IntPtr ctx, IntPtr parm);
public delegate int _ShimWriteCallback(IntPtr buf, int sz, int nmemb, IntPtr parm);
#endregion
#endif
}
}

@@ -1,114 +0,0 @@
CurlSharp
=========
CurlSharp is a .Net binding and object-oriented wrapper for [libcurl](http://curl.haxx.se/libcurl/).
libcurl is a web-client library that can provide cross-platform .Net applications with an easy way to implement such things as:
- HTTP ( GET / HEAD / PUT / POST / multi-part / form-data )
- FTP ( upload / download / list / 3rd-party )
- HTTPS, FTPS, SSL, TLS ( via OpenSSL or GnuTLS )
- Proxies, proxy tunneling, cookies, user+password authentication.
- File transfer resume, byte ranges, multiple asynchronous transfers.
- and much more...
CurlSharp provides simple get/set properties for libcurl's options and information functions, event-based hooks to libcurl's I/O, status, and progress callbacks, and wraps the C-style file I/O behind simple filename properties. The `CurlEasy` class has more than 100 different properties and methods to handle a wide variety of URL transfer requirements. While this may seem overwhelming at first glance, the good news is that you will probably need only a tiny subset of these for most situations.
The CurlSharp library consists of these parts:
- Pure C# P/Invoke bindings to the libcurl API.
- Optional libcurlshim helper DLL [WIN32].
- The `CurlEasy` class which provides a wrapper around a `curl_easy` session.
- The `CurlMulti` class, which serves as a container for multiple CurlEasy objects, and provides a wrapper around a `curl_multi` session.
- The `CurlShare` class which provides an infrastructure for serializing access to data shared by multiple `CurlEasy` objects, including cookie data and DNS hosts. It implements the `curl_share_xxx` API.
- The `CurlHttpMultiPartForm` to easily construct multi-part forms.
- The `CurlSlist` class which wraps a linked list of strings used in cURL.
CurlSharp is available for these platforms:
- [Stable] Windows 32-bit
- [Experimental] Win64 port
- [Experimental] Mono Linux & OS X support
#### Examples ####
A simple HTTP download program...
```c#
using System;
using System.Text;
using CurlSharp;

internal class EasyGet
{
    public static void Main(String[] args)
    {
        Curl.GlobalInit(CurlInitFlag.All);
        try
        {
            using (var easy = new CurlEasy())
            {
                easy.Url = "http://www.google.com/";
                easy.WriteFunction = OnWriteData;
                easy.Perform();
            }
        }
        finally
        {
            Curl.GlobalCleanup();
        }
    }

    public static Int32 OnWriteData(byte[] buf, Int32 size, Int32 nmemb, object data)
    {
        Console.Write(Encoding.UTF8.GetString(buf));
        return size * nmemb;
    }
}
```
Simple HTTP Post example:
```c#
using (var easy = new CurlEasy())
{
    easy.Url = "http://hostname/testpost.php";
    easy.Post = true;
    var postData = "parm1=12345&parm2=Hello+world%21";
    easy.PostFields = postData;
    easy.PostFieldSize = postData.Length;
    easy.Perform();
}
```
HTTP/2.0 download:
```c#
using (var easy = new CurlEasy())
{
    easy.Url = "https://google.com/";
    easy.WriteFunction = OnWriteData;
    // HTTP/2 please
    easy.HttpVersion = CurlHttpVersion.Http2_0;
    // skip SSL verification during debugging
    easy.SslVerifyPeer = false;
    easy.SslVerifyhost = false;
    easy.Perform();
}
```
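A short sketch combining cookies, custom request headers (via `CurlSlist`) and error checking; it uses only members that Jackett's `CurlHelper` further down this page exercises, and the URL, cookie and header values are placeholders:

```c#
using (var easy = new CurlEasy())
{
    easy.Url = "http://hostname/api/items";      // placeholder URL
    easy.UserAgent = "CurlSharp-sample/1.0";
    easy.FollowLocation = true;
    easy.ConnectTimeout = 20;
    easy.Cookie = "session=abc123";              // placeholder cookie

    // Extra request headers travel as a CurlSlist set via CurlOption.HttpHeader.
    var headers = new CurlSlist();
    headers.Append("Accept: application/json");
    easy.SetOpt(CurlOption.HttpHeader, headers);

    easy.WriteFunction = OnWriteData;            // same callback as the first example
    easy.Perform();

    if (easy.LastErrorCode != CurlCode.Ok)
        Console.WriteLine("Error " + easy.LastErrorCode + ": " + easy.LastErrorDescription);
}
```

Reusing `OnWriteData` from the first example keeps the response handling identical; only the request setup changes.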
More samples are included in the Samples folder.
#### Credits ####
CurlSharp was written by Dr. Masroor Ehsan.
CurlSharp is based on original code from Jeff Phillips' [libcurl.NET](http://sourceforge.net/projects/libcurl-net/); the original code has been modified and greatly enhanced.
----------
CurlSharp Copyright © 2013-17 Dr. Masroor Ehsan

@@ -1,7 +0,0 @@
namespace CurlSharp
{
public class SSLFix
{
public const string CipherList = "rsa_aes_128_sha,ecdhe_rsa_aes_256_sha,ecdhe_ecdsa_aes_128_sha";
}
}

@@ -1,282 +0,0 @@
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using CurlSharp;
using CurlSharp.Enums;
using Jackett.Common.Models.Config;
using Jackett.Common.Utils;
namespace Jackett.Common
{
public class CurlHelper
{
private static readonly object instance = new object();
public class CurlRequest
{
public string Url { get; private set; }
public string Cookies { get; private set; }
public string Referer { get; private set; }
public HttpMethod Method { get; private set; }
public IEnumerable<KeyValuePair<string, string>> PostData { get; set; }
public Dictionary<string, string> Headers { get; set; }
public string RawPOSTDdata { get; set; }
public CurlRequest(HttpMethod method, string url, string cookies = null, string referer = null, Dictionary<string, string> headers = null, string rawPOSTData = null)
{
Method = method;
Url = url.Replace(" ", "+"); // avoids bad request to cloudflare for urls containing a space followed by H (" H")
Cookies = cookies;
Referer = referer;
Headers = headers;
RawPOSTDdata = rawPOSTData;
}
}
public class CurlResponse
{
public List<string[]> HeaderList { get; private set; }
public byte[] Content { get; private set; }
public HttpStatusCode Status { get; private set; }
public string Cookies { set; get; }
public CurlResponse(List<string[]> headers, byte[] content, HttpStatusCode s, string cookies)
{
HeaderList = headers;
Content = content;
Status = s;
Cookies = cookies;
}
}
public static async Task<CurlResponse> GetAsync(string url, ServerConfig config, string cookies = null, string referer = null, Dictionary<string, string> headers = null)
{
var curlRequest = new CurlRequest(HttpMethod.Get, url, cookies, referer, headers);
var result = await PerformCurlAsync(curlRequest, config);
return result;
}
public static async Task<CurlResponse> PostAsync(string url, ServerConfig config, IEnumerable<KeyValuePair<string, string>> formData, string cookies = null, string referer = null, Dictionary<string, string> headers = null, string rawPostData = null)
{
var curlRequest = new CurlRequest(HttpMethod.Post, url, cookies, referer, headers);
curlRequest.PostData = formData;
curlRequest.RawPOSTDdata = rawPostData;
var result = await PerformCurlAsync(curlRequest, config);
return result;
}
public static async Task<CurlResponse> PerformCurlAsync(CurlRequest curlRequest, ServerConfig config)
{
return await Task.Run(() => PerformCurl(curlRequest, config));
}
public delegate void ErrorMessage(string s);
public static ErrorMessage OnErrorMessage;
public static CurlResponse PerformCurl(CurlRequest curlRequest, ServerConfig config)
{
lock (instance)
{
var headerBuffers = new List<byte[]>();
var contentBuffers = new List<byte[]>();
using (var easy = new CurlEasy())
{
easy.Url = curlRequest.Url;
easy.BufferSize = 64 * 1024;
easy.UserAgent = BrowserUtil.ChromeUserAgent;
easy.FollowLocation = false;
easy.ConnectTimeout = 20;
if (curlRequest.Headers != null)
{
CurlSlist curlHeaders = new CurlSlist();
foreach (var header in curlRequest.Headers)
{
curlHeaders.Append(header.Key + ": " + header.Value);
}
easy.SetOpt(CurlOption.HttpHeader, curlHeaders);
}
easy.WriteFunction = (byte[] buf, int size, int nmemb, object data) =>
{
contentBuffers.Add(buf);
return size * nmemb;
};
easy.HeaderFunction = (byte[] buf, int size, int nmemb, object extraData) =>
{
headerBuffers.Add(buf);
return size * nmemb;
};
if (!string.IsNullOrEmpty(curlRequest.Cookies))
easy.Cookie = curlRequest.Cookies;
if (!string.IsNullOrEmpty(curlRequest.Referer))
easy.Referer = curlRequest.Referer;
if (curlRequest.Method == HttpMethod.Post)
{
if (!string.IsNullOrEmpty(curlRequest.RawPOSTDdata))
{
easy.Post = true;
easy.PostFields = curlRequest.RawPOSTDdata;
easy.PostFieldSize = Encoding.UTF8.GetByteCount(curlRequest.RawPOSTDdata);
}
else
{
easy.Post = true;
var postString = StringUtil.PostDataFromDict(curlRequest.PostData);
easy.PostFields = postString;
easy.PostFieldSize = Encoding.UTF8.GetByteCount(postString);
}
}
if (config.RuntimeSettings.DoSSLFix == true)
{
// http://stackoverflow.com/questions/31107851/how-to-fix-curl-35-cannot-communicate-securely-with-peer-no-common-encryptio
// https://git.fedorahosted.org/cgit/mod_nss.git/plain/docs/mod_nss.html
easy.SslCipherList = SSLFix.CipherList;
easy.FreshConnect = true;
easy.ForbidReuse = true;
}
if (config.RuntimeSettings.IgnoreSslErrors == true)
{
easy.SetOpt(CurlOption.SslVerifyhost, false);
easy.SetOpt(CurlOption.SslVerifyPeer, false);
}
var proxy = config.GetProxyUrl();
if (proxy != null)
{
easy.SetOpt(CurlOption.HttpProxyTunnel, 1);
easy.SetOpt(CurlOption.Proxy, proxy);
var authString = config.GetProxyAuthString();
if (authString != null)
{
easy.SetOpt(CurlOption.ProxyUserPwd, authString);
}
}
easy.Perform();
if (easy.LastErrorCode != CurlCode.Ok)
{
var message = "Error " + easy.LastErrorCode.ToString() + " " + easy.LastErrorDescription + " " + easy.ErrorBuffer;
if (null != OnErrorMessage)
OnErrorMessage(message);
else
Console.WriteLine(message);
}
}
var headerBytes = Combine(headerBuffers.ToArray());
var headerString = Encoding.UTF8.GetString(headerBytes);
if (config.GetProxyUrl() != null)
{
var firstcrlf = headerString.IndexOf("\r\n\r\n");
var secondcrlf = headerString.IndexOf("\r\n\r\n", firstcrlf + 1);
if (secondcrlf > 0)
{
headerString = headerString.Substring(firstcrlf + 4, secondcrlf - (firstcrlf));
}
}
var headerParts = headerString.Split(new char[] { '\n', '\r' }, StringSplitOptions.RemoveEmptyEntries);
var headers = new List<string[]>();
var headerCount = 0;
HttpStatusCode status = HttpStatusCode.NotImplemented;
var cookieBuilder = new StringBuilder();
var cookies = new List<Tuple<string, string>>();
foreach (var headerPart in headerParts)
{
if (headerCount == 0)
{
var split = headerPart.Split(' ');
if (split.Length < 2)
throw new Exception("HTTP Header missing");
var responseCode = int.Parse(headerPart.Split(' ')[1]);
status = (HttpStatusCode)responseCode;
}
else
{
var keyVal = headerPart.Split(new char[] { ':' }, 2);
if (keyVal.Length > 1)
{
var key = keyVal[0].ToLower().Trim();
var value = keyVal[1].Trim();
if (key == "set-cookie")
{
var nameSplit = value.IndexOf('=');
if (nameSplit > -1)
{
var cKey = value.Substring(0, nameSplit);
var cVal = value.Split(';')[0] + ";";
cookies.Add(new Tuple<string, string>(cKey, cVal));
}
}
else
{
headers.Add(new[] { key, value });
}
}
}
headerCount++;
}
foreach (var cookieGroup in cookies.GroupBy(c => c.Item1))
{
cookieBuilder.AppendFormat("{0} ", cookieGroup.Last().Item2);
}
// add some debug output to track down the problem causing people getting InternalServerError results
if (status == HttpStatusCode.NotImplemented || status == HttpStatusCode.InternalServerError)
{
try
{
OnErrorMessage("got NotImplemented/InternalServerError");
OnErrorMessage("request.Method: " + curlRequest.Method);
OnErrorMessage("request.Url: " + curlRequest.Url);
OnErrorMessage("request.Cookies: " + curlRequest.Cookies);
OnErrorMessage("request.Referer: " + curlRequest.Referer);
OnErrorMessage("request.RawPOSTDdata: " + curlRequest.RawPOSTDdata);
OnErrorMessage("cookies: " + cookieBuilder.ToString().Trim());
OnErrorMessage("headerString:\n" + headerString);
foreach (var headerPart in headerParts)
{
OnErrorMessage("headerParts: " + headerPart);
}
}
catch (Exception ex)
{
OnErrorMessage(string.Format("CurlHelper: error while handling NotImplemented/InternalServerError:\n{0}", ex));
}
}
var contentBytes = Combine(contentBuffers.ToArray());
var curlResponse = new CurlResponse(headers, contentBytes, status, cookieBuilder.ToString().Trim());
return curlResponse;
}
}
public static byte[] Combine(params byte[][] arrays)
{
byte[] ret = new byte[arrays.Sum(x => x.Length)];
int offset = 0;
foreach (byte[] data in arrays)
{
Buffer.BlockCopy(data, 0, ret, offset, data.Length);
offset += data.Length;
}
return ret;
}
}
}

@@ -1,7 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<dllmap dll="libcurl.dll" target="libcurl.so.4" />
<dllmap os="osx" dll="libcurl.dll" target="libcurl.4.dylib"/>
<!--<dllmap os="freebsd" dll="libcurl.dll" target="libcurl.so.4" />-->
<!--<dllmap os="solaris" dll="libcurl.dll" target="libcurl.so.4" />-->
</configuration>

@@ -62,13 +62,13 @@
selector: a.view-torrent
attribute: href
size:
selector: td:nth-child(4)
seeders:
selector: td:nth-child(6)
leechers:
selector: td:nth-child(7)
grabs:
selector: td:nth-child(5)
seeders:
selector: td:nth-child(7)
leechers:
selector: td:nth-child(8)
grabs:
selector: td:nth-child(6)
filters:
- name: regexp
args: ([\d\.]+)

@@ -0,0 +1,117 @@
---
site: btgigs
name: BTGigs
description: "BTGigs (TG) is a POLISH Private Torrent Tracker for MOVIES / TV / GENERAL"
language: pl-pl
type: private
encoding: ISO-8859-2
links:
- https://btgigs.info/
caps:
categorymappings:
- {id: 36, cat: Audio/Audiobook, desc: "aBooki"}
- {id: 27, cat: TV/Anime, desc: "Anime"}
- {id: 1, cat: PC, desc: "Aplikacje PC"}
- {id: 10, cat: Books/EBook, desc: "eBooki"}
- {id: 34, cat: Movies/BluRay, desc: "Filmy/BR"}
- {id: 4, cat: Movies/SD, desc: "Filmy/DVD-R"}
- {id: 31, cat: Movies/HD, desc: "Filmy/HD Rip"}
- {id: 17, cat: Movies/Other, desc: "Filmy/Inne"}
- {id: 35, cat: Movies/UHD, desc: "Filmy/UHD"}
- {id: 20, cat: Movies/SD, desc: "Filmy/XviD"}
- {id: 21, cat: Console, desc: "Gry/konsole"}
- {id: 7, cat: PC/Games, desc: "Gry/PC ISO"}
- {id: 12, cat: PC/Games, desc: "Gry/PC Rips"}
- {id: 28, cat: Other, desc: "GSM/PDA"}
- {id: 19, cat: Audio/Video, desc: "Koncerty/Teledyski"}
- {id: 32, cat: Audio/Lossless, desc: "musicDVD/DTS/FLAC"}
- {id: 5, cat: Audio/MP3, desc: "Muzyka/MP3"}
- {id: 26, cat: Other, desc: "Rozne"}
- {id: 30, cat: TV/Sport, desc: "Sport"}
- {id: 6, cat: TV, desc: "TV/Seriale"}
- {id: 29, cat: PC, desc: "Witaminki"}
- {id: 9, cat: XXX, desc: "XXX"}
modes:
search: [q]
tv-search: [q, season, ep]
movie-search: [q]
login:
path: /takelogin__akcja.php
method: post
inputs:
username_dupa: "{{ .Config.username }}"
password__dupa: "{{ .Config.password }}"
error:
- selector: td.embedded:has(h2:contains("failed"))
- selector: td.embedded:has(h2:contains("Error"))
test:
selector: a[href^="logout.php"]
path: /browse.php
search:
paths:
- path: /browse.php
inputs:
$raw: "{{range .Categories}}c{{.}}=1&{{end}}"
search: "{{ .Query.Keywords }}"
incldead: 1
tyt: 0
lang: 0
subcat: 0
rows:
selector: table[border="1"][cellpadding=5] > tbody > tr:has(a[href^="details.php?id="])
fields:
title:
selector: a[href^="details.php?id="]
details:
selector: a[href^="details.php?id="]
attribute: href
category:
selector: a[href^="browse.php?cat="]
attribute: href
filters:
- name: querystring
args: cat
download:
selector: a[href^="download.php/"]
attribute: href
description:
optional: true
selector: img[src^="/pic/cat_pl/"]
attribute: src
filters:
- name: append
args: "Language: polish\n<br>"
- name: prepend
args: "{{ .Result.description }}"
imdb:
optional: true
selector: a[href^="http://www.imdb.com/title/tt"]
date:
selector: td:nth-child(5)
filters:
- name: append
args: " +00:00"
- name: dateparse
args: "2006-01-0215:04:05 -07:00"
grabs:
selector: td:nth-child(7)
filters:
- name: regexp
args: (\d+)
size:
selector: td:nth-child(6)
seeders:
selector: td:nth-child(8)
leechers:
selector: td:nth-child(9)
downloadvolumefactor:
case:
"img[src=\"pic/ico_disk1.png\"]": 0
"img[src=\"pic/ico_disk2.png\"]": 1
"*": 1
uploadvolumefactor:
text: "1"

@@ -67,6 +67,14 @@
args: [".*? / ", ""]
- name: diacritics
args: replace
- name: replace
args: ["1080i", "1080p"]
- name: replace
args: ["720i", "720p"]
- name: replace
args: ["pLQ", "p"]
- name: replace
args: ["pHD", "p"]
- name: replace
args: ["serie", ""]
- name: replace

@@ -101,80 +101,50 @@
search:
paths:
- path: /index.php
keywordsfilters:
- name: re_replace
args: ["S[0-9]{2}([^E]|$)", ""] # remove season tag without episode (search doesn't support it)
- name: diacritics
args: replace
# most ITA TV torrents are in XXxYY format, so we search without S/E prefixes and filter later
- name: re_replace
args: ["S0?(\\d{1,2})", " $1 "]
- name: re_replace
args: ["E(\\d{2,3})", " $1 "]
inputs:
search: "{{ .Keywords }}"
category: "{{range .Categories}}{{.}};{{end}}"
page: "torrents"
active: 0
keywordsfilters:
- name: diacritics
args: replace
- name: re_replace # S01 to 1
args: ["(?i)\\bS0*(\\d+)\\b", "$1"]
- name: re_replace # S01E01 to 1 1
args: ["(?i)\\bS0*(\\d+)E0*(\\d+)\\b", "$1 $2"]
rows:
selector: div.b-content > table > tbody > tr > td > table.lista > tbody > tr:has(a[href^="index.php?page=torrent-details&id="])
#http://girotorrent.org/index.php?page=torrent-details&id=73d93dccf84ea3a8b614a3113acfd9eea186d730
filters:
- name: andmatch
fields:
download:
selector: a[href^="index.php?page=downloadcheck&id="]
attribute: href
title: # shortened title?
title:
selector: a[onmouseover][href^="index.php?page=torrent-details&id="]
# normalize to SXXEYY format
filters:
- name: re_replace # replace special characters with " " (space)
args: ["[^a-zA-Z0-9]|\\.", " "]
args: ["[^a-zA-Z0-9\\s]|\\.", " "]
- name: re_replace # replace multiple spaces
args: ["[ ]{2,}", " "]
# normalize to SXXEYY format
- name: re_replace
args: ["(\\d{2})x(\\d{2})", "S$1E$2"]
- name: re_replace
args: ["(\\d{1})x(\\d{2})", "S0$1E$2"]
- name: re_replace #Stagione X --> S0X
args: ["Stagione (\\d{0,1}\\s)", "S0$1"]
- name: re_replace #Stagione XX --> SXX
args: ["Stagione (\\d{2}\\s)", "S$1"]
- name: re_replace #/ Episodio [YY-YY --> EYY-YY
args: ["(\\s\\/\\sEpisodio|\\s\\/\\sEpisodi|\\sEpisodio|\\s\\|\\sEpisodio|\\sEpisodi)\\s\\[", "E"]
- name: re_replace #/ Completa [episodi YY-YY --> EYY-YY
args: ["(\\s\\/\\sCompleta\\s\\[episodi\\s)", "E"]
- name: re_replace #remove di YY] | remove /YY]
args: ["(\\sdi\\s\\d{1,2}|\\/\\d{1,2})\\]", " "]
- name: re_replace #remove various
args: ["(Serie completa|Completa|\\[in pausa\\])", ""]
# end of test block
title: # long titles?
optional: true
selector: a[title][href^="index.php?page=torrent-details"]
attribute: title
filters:
- name: replace
args: ["Vedi Dettagli: ", ""]
# start of test block
- name: re_replace # replace special characters with " " (space)
args: ["[^a-zA-Z0-9]|\\.", " "]
# normalize to SXXEYY format
- name: re_replace
args: ["(\\d{2})x(\\d{2})", "S$1E$2"]
- name: re_replace
args: ["(\\d{1})x(\\d{2})", "S0$1E$2"]
- name: re_replace #Stagione X --> S0X
args: ["Stagione (\\d{0,1}\\s)", "S0$1"]
- name: re_replace #Stagione XX --> SXX
args: ["Stagione (\\d{2}\\s)", "S$1"]
- name: re_replace #/ Episodio [YY-YY --> EYY-YY
args: ["(\\s\\/\\sEpisodio|\\s\\/\\sEpisodi|\\sEpisodio|\\s\\|\\sEpisodio|\\sEpisodi)\\s\\[", "E"]
- name: re_replace #/ Completa [episodi YY-YY --> EYY-YY
args: ["(\\s\\/\\sCompleta\\s\\[episodi\\s)", "E"]
- name: re_replace #remove di YY] | remove /YY]
args: ["(\\sdi\\s\\d{1,2}|\\/\\d{1,2})\\]", " "]
- name: re_replace #remove various
args: ["(Serie completa|Completa|\\[in pausa\\])", ""]
# end of test block
- name: re_replace # S01 E01 to S01E01
args: ["(?i)\\bS(\\d+)\\sE(\\d+)\\b", "S$1E$2"]
- name: re_replace # 01x01 to S01E01
args: ["(?i)(\\d{2})x(\\d+)", "S$1E$2"]
- name: re_replace # 1x01 to S01E01
args: ["(?i)\\b(\\d{1})x(\\d+)", "S0$1E$2"]
- name: re_replace # Stagione X --> S0X
args: ["(?i)\\bStagion[ei]\\s?(\\d{1})\\b|\\bSeason'?s?\\s?(\\d{1})\\b", "S0$1$2"]
- name: re_replace # Stagione XX --> SXX
args: ["(?i)\\bStagion[ei]\\s?(\\d{2,})\\b|\\bSeason'?s?\\s?(\\d{2,})\\b", "S$1$2"]
- name: re_replace # Episodio 4 to E4
args: ["(?i)\\b(?:[\\/\\|]?Episodio\\s?(\\d+)|Puntata\\s?(\\d+))", "E$1$2"]
- name: re_replace # Episodi 4 5 to E04-05
args: ["(?i)\\b(?:Puntate\\s*)(\\d+)\\s?(\\d+)", "E0$1-0$2"]
- name: re_replace # various removals
args: ["(?i)(Serie completa|Completat?a?|in pausa)", ""]
category:
selector: a[href^="index.php?page=torrents&category="]
attribute: href

@@ -0,0 +1,130 @@
---
site: hqsource
name: HQSource
description: "HQSource (HQS) is a POLISH Private Torrent Tracker for MOVIES / TV / GENERAL"
language: pl-pl
type: private
encoding: ISO-8859-2
links:
- http://hqsource.org/
caps:
categorymappings:
- {id: 36, cat: Movies/3D, desc: "3D"}
- {id: 3, cat: Movies/UHD, desc: "4K/UHD"}
- {id: 2, cat: Movies/HD, desc: "BDRip"}
- {id: 1, cat: Movies/HD, desc: "BRRip"}
- {id: 49, cat: Movies/BluRay, desc: "BluRay"}
- {id: 8, cat: Movies/SD, desc: "DVD"}
- {id: 4, cat: TV/HD, desc: "HDTV"}
- {id: 7, cat: Movies/SD, desc: "HQDVDRip"}
- {id: 45, cat: Movies/HD, desc: "MKV"}
- {id: 11, cat: Audio, desc: "Music"}
- {id: 6, cat: Other, desc: "Special"}
- {id: 46, cat: PC, desc: "Tools"}
- {id: 9, cat: TV, desc: "TV-Series"}
- {id: 5, cat: Movies, desc: "WEB-DL"}
- {id: 35, cat: XXX, desc: "XXX"}
modes:
search: [q]
tv-search: [q, season, ep]
movie-search: [q]
settings:
- name: username
type: text
label: Username
- name: password
type: password
label: Password
- name: pin
type: text
label: Pin
login:
path: /takelogin.php
method: post
inputs:
username: "{{ .Config.username }}"
password: "{{ .Config.password }}"
pin: "{{ .Config.pin }}"
returnto: "/"
error:
- selector: td.embedded:has(h2:contains("failed"))
- selector: td.embedded:has(h2:contains("Error"))
test:
selector: a[href^="logout.php"]
path: /browse.php
search:
paths:
- path: /browse.php
inputs:
$raw: "{{range .Categories}}c{{.}}=1&{{end}}"
search: "{{ .Query.Keywords }}"
incldead: 1
polish: 0
blah: 0
rows:
selector: table#line > tbody > tr:has(a[href^="details.php?id="])
fields:
title:
selector: a[href^="details.php?id="]
details:
selector: a[href^="details.php?id="]
attribute: href
category:
selector: a[href^="browse.php?cat="]
attribute: href
filters:
- name: querystring
args: cat
download:
selector: a[href^="download.php/"]
attribute: href
description:
optional: true
selector: img[src="pic/pl.png"]
filters:
- name: append
args: "Language: polish\n<br>"
- name: prepend
args: "{{ .Result.description }}"
description:
optional: true
selector: img[src="pic/napisy.png"]
filters:
- name: append
args: "Subbed\n<br>"
- name: prepend
args: "{{ .Result.description }}"
imdb:
optional: true
selector: a[href^="http://www.imdb.com/title/tt"]
grabs:
selector: td:nth-child(6)
filters:
- name: regexp
args: (\d+)
size:
selector: td:nth-child(5)
date:
selector: tr, br
filters:
- name: append
args: " +00:00"
- name: dateparse
args: "2006-01-0215:04:05 -07:00"
seeders:
selector: td:nth-child(7)
leechers:
selector: td:nth-child(8)
downloadvolumefactor:
case:
"img[src=\"pic/download2.gif\"]": 0
"*": 1
uploadvolumefactor:
case:
"img[src=\"pic/double.png\"]": 2
"*": 1

@@ -6,6 +6,8 @@
type: public
encoding: UTF-8
links:
- https://idope.cc/
legacylinks:
- https://idope.se/
caps:

@@ -53,15 +53,12 @@
paths:
- path: /index.php
keywordsfilters:
- name: re_replace
args: ["S[0-9]{2}([^E]|$)", ""] # remove season tag without episode (search doesn't support it)
- name: diacritics
args: replace
# most ITA TV torrents are in XXxYY format, so we search without S/E prefixes and filter later
- name: re_replace
args: ["S0?(\\d{1,2})", " $1 "]
- name: re_replace
args: ["E(\\d{2,3})", " $1 "]
- name: re_replace # S01 to 1
args: ["(?i)\\bS0*(\\d+)\\b", "$1"]
- name: re_replace # S01E01 to 1 1
args: ["(?i)\\bS0*(\\d+)E0*(\\d+)\\b", "$1 $2"]
inputs:
search: "{{ .Keywords }}"
category: "{{range .Categories}}{{.}};{{end}}"
@@ -69,31 +66,33 @@
active: 0
rows:
selector: div.b-content > table > tbody > tr > td > table.lista > tbody > tr:has(a[href^="index.php?page=torrents&category="])
filters:
- name: andmatch
fields:
title:
selector: td:nth-child(2) > a
# normalize to SXXEYY format
filters:
- name: re_replace # replace special characters with " " (space)
args: ["[^a-zA-Z0-9]|\\.", " "]
args: ["[^a-zA-Z0-9\\s]|\\.", " "]
- name: re_replace # replace multiple spaces
args: ["[ ]{2,}", " "]
# normalize to SXXEYY format
- name: re_replace
args: ["(\\d{2})x(\\d{2})", "S$1E$2"]
- name: re_replace
args: ["(\\d{1})x(\\d{2})", "S0$1E$2"]
- name: re_replace #Stagione X --> S0X
args: ["Stagione (\\d{0,1}\\s)", "S0$1"]
- name: re_replace #Stagione XX --> SXX
args: ["Stagione (\\d{2}\\s)", "S$1"]
- name: re_replace #/ Episodio [YY-YY --> EYY-YY
args: ["(\\s\\/\\sEpisodio|\\s\\/\\sEpisodi|\\sEpisodio|\\s\\|\\sEpisodio|\\sEpisodi)\\s\\[", "E"]
- name: re_replace #/ Completa [episodi YY-YY --> EYY-YY
args: ["(\\s\\/\\sCompleta\\s\\[episodi\\s)", "E"]
- name: re_replace #remove di YY] | remove /YY]
args: ["(\\sdi\\s\\d{1,2}|\\/\\d{1,2})\\]", " "]
- name: re_replace #remove various
args: ["(Serie completa|Completa|\\[in pausa\\])", ""]
# end of test block
- name: re_replace # S01 E01 to S01E01
args: ["(?i)\\bS(\\d+)\\sE(\\d+)\\b", "S$1E$2"]
- name: re_replace # 01x01 to S01E01
args: ["(?i)(\\d{2})x(\\d+)", "S$1E$2"]
- name: re_replace # 1x01 to S01E01
args: ["(?i)\\b(\\d{1})x(\\d+)", "S0$1E$2"]
- name: re_replace # Stagione X --> S0X
args: ["(?i)\\bStagion[ei]\\s?(\\d{1})\\b|\\bSeason'?s?\\s?(\\d{1})\\b", "S0$1$2"]
- name: re_replace # Stagione XX --> SXX
args: ["(?i)\\bStagion[ei]\\s?(\\d{2,})\\b|\\bSeason'?s?\\s?(\\d{2,})\\b", "S$1$2"]
- name: re_replace # Episodio 4 to E4
args: ["(?i)\\b(?:[\\/\\|]?Episodio\\s?(\\d+)|Puntata\\s?(\\d+))", "E$1$2"]
- name: re_replace # Episodi 4 5 to E04-05
args: ["(?i)\\b(?:Puntate\\s*)(\\d+)\\s?(\\d+)", "E0$1-0$2"]
- name: re_replace # various removals
args: ["(?i)(Serie completa|Completat?a?|in pausa)", ""]
download: # handle torrents with normal torrent file download
optional: true
selector: a[href^="download.php?id="]
@@ -135,7 +134,7 @@
- name: querystring
args: category
details:
selector: td:nth-child(2) a
selector: td:nth-child(2) > a
attribute: href
banner:
optional: true

@@ -31,32 +31,44 @@
- name: itorrents-links
type: checkbox
label: Add download links via itorrents.org
- name: advanced-search
type: checkbox
label: Use the advanced search of IlCorsaroNero (experimental)
# - name: advanced-search
# type: checkbox
# label: Use the advanced search of IlCorsaroNero (experimental)
search:
paths:
# https://ilcorsaronero.info/advsearch.php?&category=15&search=flash+4&&order=data&by=DESC&page=3
# {{range .Categories}}{{.}};{{end}}
##### Are the "not" and "and" functions implemented? Or am I doing it wrong?
# path: "{{if and .Query.Keywords .advanced-search}}adv/{{ .Query.Keywords}}.html
# {{else if and .Query.Keywords (not .advanced-search)}}argh.php?search={{ .Query.Keywords}}
# {{else}}/recenti
# {{end}}"
- path: "{{if .Keywords}}argh.php?search={{ .Keywords}}
{{else}}/recenti
{{end}}"
# - path: "{{if .Keywords}}argh.php?search={{ .Keywords}}
# {{else}}/recenti
# {{end}}"
- path: "{{if .Keywords}}advsearch.php?&category={{range .Categories}}{{.}};{{end}}&search={{ .Keywords}}&order=data&by=DESC&page=0{{else}}/recenti{{end}}"
- path: "{{if .Keywords}}advsearch.php?&category={{range .Categories}}{{.}};{{end}}&search={{ .Keywords}}&order=data&by=DESC&page=1{{else}}/recenti{{end}}"
- path: "{{if .Keywords}}advsearch.php?&category={{range .Categories}}{{.}};{{end}}&search={{ .Keywords}}&order=data&by=DESC&page=2{{else}}/recenti{{end}}"
- path: "{{if .Keywords}}advsearch.php?&category={{range .Categories}}{{.}};{{end}}&search={{ .Keywords}}&order=data&by=DESC&page=3{{else}}/recenti{{end}}"
- path: "{{if .Keywords}}advsearch.php?&category={{range .Categories}}{{.}};{{end}}&search={{ .Keywords}}&order=data&by=DESC&page=4{{else}}/recenti{{end}}"
- path: "{{if .Keywords}}advsearch.php?&category={{range .Categories}}{{.}};{{end}}&search={{ .Keywords}}&order=data&by=DESC&page=5{{else}}/recenti{{end}}"
- path: "{{if .Keywords}}advsearch.php?&category={{range .Categories}}{{.}};{{end}}&search={{ .Keywords}}&order=data&by=DESC&page=6{{else}}/recenti{{end}}"
- path: "{{if .Keywords}}advsearch.php?&category={{range .Categories}}{{.}};{{end}}&search={{ .Keywords}}&order=data&by=DESC&page=7{{else}}/recenti{{end}}"
- path: "{{if .Keywords}}advsearch.php?&category={{range .Categories}}{{.}};{{end}}&search={{ .Keywords}}&order=data&by=DESC&page=8{{else}}/recenti{{end}}"
- path: "{{if .Keywords}}advsearch.php?&category={{range .Categories}}{{.}};{{end}}&search={{ .Keywords}}&order=data&by=DESC&page=9{{else}}/recenti{{end}}"
keywordsfilters:
- name: re_replace
args: ["S[0-9]{2}([^E]|$)", ""] # remove season tag without episode (search doesn't support it)
- name: diacritics
args: replace
# most ITA TV torrents are in XXxYY format, so we search without S/E prefixes and filter later
- name: re_replace
args: ["S0?(\\d{1,2})", " $1 "]
- name: re_replace
args: ["E(\\d{2,3})", " $1 "]
- name: re_replace # S01 to 1
args: ["(?i)\\bS0*(\\d+)\\b", "$1"]
- name: re_replace # S01E01 to 1 1
args: ["(?i)\\bS0*(\\d+)E0*(\\d+)\\b", "$1 $2"]
rows:
selector: "tr.odd,tr.odd2"
filters:
- name: andmatch
fields:
title:
selector: td:nth-child(2) a.tab
@@ -65,32 +77,27 @@
- name: split
args: [ "/", -1 ]
- name: urldecode
- name: re_replace
args: [ "_+", " "]
- name: replace
args: [ ".", " "]
- name: re_replace
args: [ "\\s{2,}", " "]
- name: re_replace # replace special characters with " " (space)
args: ["[^a-zA-Z0-9\\s]|\\.", " "]
- name: re_replace # replace multiple spaces
args: ["[ ]{2,}", " "]
# normalize to SXXEYY format
- name: re_replace
args: ["(\\d{2})x(\\d{2})", "S$1E$2"]
- name: re_replace
args: ["(\\d{1})x(\\d{2})", "S0$1E$2"]
- name: re_replace #Stagione X --> S0X
args: ["Stagione (\\d{0,1}\\s)", "S0$1"]
- name: re_replace #Stagione XX --> SXX
args: ["Stagione (\\d{2}\\s)", "S$1"]
- name: re_replace #/ Episodio [YY-YY --> EYY-YY
args: ["(\\s\\/\\sEpisodio|\\s\\/\\sEpisodi|\\sEpisodio|\\s\\|\\sEpisodio|\\sEpisodi)\\s\\[", "E"]
- name: re_replace #/ Completa [episodi YY-YY --> EYY-YY
args: ["(\\s\\/\\sCompleta\\s\\[episodi\\s)", "E"]
- name: re_replace #remove di YY] | remove /YY]
args: ["(\\sdi\\s\\d{1,2}|\\/\\d{1,2})\\]", " "]
- name: re_replace #remove various
args: ["(Serie completa|Completa|\\[in pausa\\])", ""]
# end of test block
- name: re_replace #try to find multi episode
args: ["(S\\d{2}E\\d{2})\\s(\\d{2})", "$1-$2"]
- name: re_replace # S01 E01 to S01E01
args: ["(?i)\\bS(\\d+)\\sE(\\d+)\\b", "S$1E$2"]
- name: re_replace # 01x01 to S01E01
args: ["(?i)(\\d{2})x(\\d+)", "S$1E$2"]
- name: re_replace # 1x01 to S01E01
args: ["(?i)\\b(\\d{1})x(\\d+)", "S0$1E$2"]
- name: re_replace # Stagione X --> S0X
args: ["(?i)\\bStagion[ei]\\s?(\\d{1})\\b|\\bSeason'?s?\\s?(\\d{1})\\b", "S0$1$2"]
- name: re_replace # Stagione XX --> SXX
args: ["(?i)\\bStagion[ei]\\s?(\\d{2,})\\b|\\bSeason'?s?\\s?(\\d{2,})\\b", "S$1$2"]
- name: re_replace # Episodio 4 to E4
args: ["(?i)\\b(?:[\\/\\|]?Episodio\\s?(\\d+)|Puntata\\s?(\\d+))", "E$1$2"]
- name: re_replace # Episodi 4 5 to E04-05
args: ["(?i)\\b(?:Puntate\\s*)(\\d+)\\s?(\\d+)", "E0$1-0$2"]
- name: re_replace # various removals
args: ["(?i)(Serie completa|Completat?a?|in pausa)", ""]
category:
selector: td:nth-child(1) a
attribute: href

@@ -6,8 +6,9 @@
type: public
encoding: UTF-8
links:
- https://www.limetorrents.io/
- https://www.limetorrents.me/
legacylinks:
- https://www.limetorrents.io/
- https://www.limetorrents.cc/
caps:

@@ -1,90 +0,0 @@
---
site: nexttorrent
name: NextTorrent
description: "NextTorrent is a FRENCH Public site for TV / MOVIES / GENERAL"
language: fr-fr
type: public
encoding: UTF-8
links:
- http://www.nextorrent.tv/
legacylinks:
- https://www.nextorrent.site/
- http://www.nextorrent.site/
- http://www.nextorrent.bz/
- http://www.nextorrent.pro/
- https://www.nextorrent.cc/
- https://www.nextorrent.org/
- https://www.nextorrent.tv/
caps:
categorymappings:
- {id: Films, cat: Movies, desc: "Movies"}
- {id: Séries, cat: TV, desc: "TV"}
- {id: Jeux-PC, cat: PC/Games, desc: "Games PC"}
- {id: Jeux-Consoles, cat: Console, desc: "Games Console"}
- {id: Musique, cat: Audio, desc: "Music"}
- {id: Ebook, cat: Books, desc: "EBooks"}
- {id: Logiciels, cat: PC, desc: "Software"}
modes:
search: [q]
tv-search: [q, season, ep]
movie-search: [q]
settings: []
download:
selector: a[href^="/get_torrent/"]
search:
paths:
- path: "recherche/{{ .Query.Keywords }}"
rows:
selector: div.listing-torrent > table tbody tr:has(a)
fields:
site_date:
selector: td:nth-child(1) a
filters:
# date is at the end of the title, so we get it and name it site_date
- name: regexp
args: "(\\w+)$"
title:
selector: td:nth-child(1) a
filters:
# now we put the date in the right place according to scene naming rules, using .Result.site_date
- name: replace
args: ["FRENCH", "{{ .Result.site_date }} FRENCH"]
- name: replace
args: ["TRUEFRENCH", "{{ .Result.site_date }} TRUEFRENCH"]
- name: replace
args: ["VOSTFR", "{{ .Result.site_date }} VOSTFR"]
# and we delete it at the end
- name: re_replace
args: ["(\\w+)$", ""]
details:
selector: td:nth-child(1) a
attribute: href
download:
selector: td:nth-child(1) a
attribute: href
category:
selector: td:nth-child(1) i
attribute: class
size:
selector: td:nth-child(2)
date:
text: now
seeders:
text: 0
seeders:
optional: true
selector: td:nth-child(3)
leechers:
text: 0
leechers:
optional: true
selector: td:nth-child(4)
downloadvolumefactor:
text: "0"
uploadvolumefactor:
text: "1"

@@ -0,0 +1,121 @@
---
site: rstorrent
name: RedStarTorrent
description: "Red Star Torrent (RST) is a POLISH Private Torrent Tracker for TV"
language: pl-pl
type: private
encoding: ISO-8859-2
links:
- http://rstorrent.org.pl/
caps:
categorymappings:
- {id: 34, cat: PC/0day, desc: "0-day"}
- {id: 15, cat: Movies/3D, desc: "3D"}
- {id: 23, cat: TV/Anime, desc: "Anime"}
- {id: 1, cat: PC, desc: "Aplikacje"}
- {id: 30, cat: Books/EBook, desc: "Ebooki"}
- {id: 20, cat: Movies/SD, desc: "Filmy/DVD-R"}
- {id: 5, cat: Movies/HD, desc: "Filmy/HD"}
- {id: 19, cat: Movies/SD, desc: "Filmy/XviD"}
- {id: 4, cat: PC/Games, desc: "Gry/PC ISO"}
- {id: 28, cat: Other, desc: "GSM/PDA"}
- {id: 29, cat: Movies, desc: "Kids"}
- {id: 40, cat: Movies/Foreign, desc: "Kino Polska"}
- {id: 6, cat: Audio, desc: "Muzyka"}
- {id: 12, cat: TV, desc: "Paczka"}
- {id: 25, cat: Other, desc: "Rozne"}
- {id: 7, cat: TV, desc: "Seriale - Epizody"}
- {id: 3, cat: TV, desc: "Seriale - Sezony"}
- {id: 35, cat: TV, desc: "SHOW"}
- {id: 26, cat: TV/Sport, desc: "Sport"}
- {id: 36, cat: Other, desc: "Teatr"}
- {id: 27, cat: Audio/Video, desc: "Teledyski"}
- {id: 31, cat: TV/Documentary, desc: "TV Doc"}
- {id: 9, cat: XXX, desc: "XXX"}
modes:
search: [q]
tv-search: [q, season, ep]
movie-search: [q]
login:
path: /takelogin.php
method: post
inputs:
username: "{{ .Config.username }}"
password: "{{ .Config.password }}"
error:
- selector: td.embedded:has(h2:contains("failed"))
- selector: td.embedded:has(h2:contains("Error"))
test:
selector: a[href^="logout.php"]
path: /browse.php
search:
paths:
- path: /browse.php
inputs:
$raw: "{{range .Categories}}c{{.}}=1&{{end}}"
search: "{{ .Query.Keywords }}"
incldead: 1
polish: 0
rows:
selector: table[border="1"][cellpadding="5"] > tbody > tr:has(a[href^="/details.php?id="])
fields:
title:
selector: a[href^="/details.php?id="]
details:
selector: a[href^="/details.php?id="]
attribute: href
category:
selector: a[href^="/browse.php?cat="]
attribute: href
filters:
- name: querystring
args: cat
download:
selector: a[href^="/download.php/"]
attribute: href
description:
optional: true
selector: img[src="/pic/pl.gif"]
attribute: src
filters:
- name: append
args: "Language: polish\n<br>"
- name: prepend
args: "{{ .Result.description }}"
imdb:
optional: true
selector: a[href^="http://www.imdb.com/title/tt"]
date:
selector: td:nth-child(5)
filters:
- name: append
args: " +00:00"
- name: dateparse
args: "2006-01-0215:04:05 -07:00"
grabs:
selector: td:nth-child(7)
filters:
- name: regexp
args: (\d+)
size:
selector: td:nth-child(6)
seeders:
selector: td:nth-child(8)
filters:
- name: regexp
args: ^(\d+)
leechers:
selector: td:nth-child(8)
filters:
- name: regexp
args: / (\d+)
downloadvolumefactor:
case:
td.darmowy: 0
"*": 1
uploadvolumefactor:
text: "1"

@@ -13,50 +13,50 @@
caps:
categorymappings:
# Vip
- {id: 34, cat: XXX, desc: "Adulti"}
- {id: 46, cat: Other, desc: "IPTV"}
- {id: 57, cat: XXX, desc: "Riviste XXX"}
- {id: 58, cat: XXX, desc: "Fumetti XXX"}
# Applicazioni
- {id: 33, cat: PC/Phone-Android, desc: "Android"}
- {id: 8, cat: PC/0day, desc: "Linux"}
- {id: 9, cat: PC/Mac, desc: "Mac"}
- {id: 7, cat: PC/0day, desc: "PC"}
# Books
- {id: 43, cat: Books, desc: "Libreria"}
- {id: 41, cat: Books, desc: "Quotidiani"}
- {id: 59, cat: Books, desc: "Fumetti"}
- {id: 60, cat: Books, desc: "Riviste"}
- {id: 61, cat: Books, desc: "Audiolibri"}
# Games
- {id: 47, cat: PC/Games, desc: "Games PC"}
- {id: 40, cat: Console/Other, desc: "Nintendo"}
- {id: 13, cat: Console/PS4, desc: "Sony PS"}
- {id: 33, cat: Console/Xbox, desc: "XboX"}
- {id: 14, cat: Console/Wii, desc: "Wii"}
# Movie
- {id: 21, cat: Movies/DVD, desc: "Movie DVD-9"}
- {id: 11, cat: Movies/DVD, desc: "Movie DVD-5"}
- {id: 20, cat: Movies/SD, desc: "Movie DVDRip"}
- {id: 22, cat: Movies/UHD, desc: "Movie 4K-Ultra-HD"}
- {id: 23, cat: Movies/HD, desc: "Movie H-265"}
- {id: 24, cat: Movies/HD, desc: "Movie 1080p"}
- {id: 25, cat: Movies/HD, desc: "Movie 720p"}
- {id: 26, cat: Movies/3D, desc: "Movie 3D-FullHD"}
- {id: 27, cat: Movies/BluRay, desc: "Movie Blu Ray Disk"}
- {id: 43, cat: Movies/SD, desc: "BMovie DRip"}
- {id: 29, cat: Movies/SD, desc: "Movie Cine News"}
- {id: 30, cat: TV/HD, desc: "Serie Tv HD"}
- {id: 31, cat: TV/SD, desc: "Serie Tv SD"}
- {id: 35, cat: TV/Other, desc: "Programmi TV"}
- {id: 42, cat: TV/Documentary, desc: "Documentari"}
# Music
- {id: 54, cat: Audio/MP3, desc: "MP3"}
- {id: 55, cat: Audio/Lossless, desc: "Flac"}
# Movies
- {id: 17, cat: Movies/SD, desc: "Cine News"}
- {id: 43, cat: Movies/SD, desc: "BDRip"}
- {id: 16, cat: Movies/SD, desc: "DivX"}
- {id: 20, cat: Movies/SD, desc: "DVDRip"}
- {id: 21, cat: Movies/DVD, desc: "DVD"}
- {id: 25, cat: Movies/HD, desc: "720p"}
- {id: 24, cat: Movies/HD, desc: "1080p"}
- {id: 27, cat: Movies/BluRay, desc: "Blu Ray Disk"}
- {id: 23, cat: Movies/HD, desc: "H-265"}
- {id: 26, cat: Movies/3D, desc: "3D-FullHD"}
- {id: 31, cat: TV/SD, desc: "SerieTV"}
- {id: 45, cat: TV/HD, desc: "Serie Tv HD"}
- {id: 22, cat: Movies/UHD, desc: "4K-Ultra-HD"}
- {id: 49, cat: TV/Documentary, desc: "Documentari"}
- {id: 50, cat: TV/Other, desc: "Programmi TV"}
- {id: 51, cat: Movies/Other, desc: "Mp4"}
- {id: 36, cat: Audio/MP3, desc: "Music MP3"}
- {id: 37, cat: Audio/Lossless, desc: "Music Flac"}
# Games
- {id: 18, cat: PC/Games, desc: "Games PC"}
- {id: 19, cat: Console/PS3, desc: "Games PS3"}
- {id: 33, cat: Console/Xbox, desc: "Games XboX"}
- {id: 39, cat: Console/Wii, desc: "Games Wii"}
- {id: 40, cat: Console/Other, desc: "Games Nintendo"}
# Anime
- {id: 5, cat: TV/Anime, desc: "Anime"}
# Edicola
- {id: 16, cat: Books, desc: "Edicola Quotidiani"}
- {id: 28, cat: Books, desc: "Edicola Libri"}
- {id: 17, cat: Books, desc: "Edicola Riviste"}
- {id: 41, cat: Books, desc: "Edicola Fumetti"}
# Applicazioni
- {id: 7, cat: PC/0day, desc: "Applicazioni PC"}
- {id: 8, cat: PC/0day, desc: "Applicazioni Linux"}
- {id: 9, cat: PC/Mac, desc: "Applicazioni Mac"}
- {id: 32, cat: PC/Phone-Android, desc: "Applicazioni Android"}
- {id: 34, cat: PC/Phone-IOS, desc: "Applicazioni Iphone"}
# Vip
- {id: 38, cat: Other, desc: "V.I.P."}
# Adult
- {id: 13, cat: XXX, desc: "Riviste XXX"}
- {id: 14, cat: XXX, desc: "Fumetti XXX"}
- {id: 44, cat: XXX, desc: "Adulti"}
modes:
search: [q]
@@ -79,15 +79,12 @@
paths:
- path: /index.php
keywordsfilters:
- name: re_replace
args: ["S[0-9]{2}([^E]|$)", ""] # remove season tag without episode (search doesn't support it)
- name: diacritics
args: replace
# most ITA TV torrents are in XXxYY format, so we search without S/E prefixes and filter later
- name: re_replace
args: ["S0?(\\d{1,2})", " $1 "]
- name: re_replace
args: ["E(\\d{2,3})", " $1 "]
- name: re_replace # S01 to 1
args: ["(?i)\\bS0*(\\d+)\\b", "$1"]
- name: re_replace # S01E01 to 1 1
args: ["(?i)\\bS0*(\\d+)E0*(\\d+)\\b", "$1 $2"]
inputs:
search: "{{ .Keywords }}"
category: "{{range .Categories}}{{.}};{{end}}"
@@ -95,62 +92,62 @@
active: 0
rows:
selector: div.b-content > table > tbody > tr > td > table.lista > tbody > tr:has(a[href^="index.php?page=torrents&category="])
filters:
- name: andmatch
fields:
download:
selector: a[href^="download.php?id="]
attribute: href
title: # shortened title?
selector: a[href^="index.php?page=torrent-details"]
# normalize to SXXEYY format
filters:
- name: re_replace # replace special characters with " " (space)
args: ["[^a-zA-Z0-9]|\\.", " "]
args: ["[^a-zA-Z0-9\\s]|\\.", " "]
- name: re_replace # replace multiple spaces
args: ["[ ]{2,}", " "]
# normalize to SXXEYY format
- name: re_replace
args: ["(\\d{2})x(\\d{2})", "S$1E$2"]
- name: re_replace
args: ["(\\d{1})x(\\d{2})", "S0$1E$2"]
- name: re_replace #Stagione X --> S0X
args: ["Stagione (\\d{0,1}\\s)", "S0$1"]
- name: re_replace #Stagione XX --> SXX
args: ["Stagione (\\d{2}\\s)", "S$1"]
- name: re_replace #/ Episodio [YY-YY --> EYY-YY
args: ["(\\s\\/\\sEpisodio|\\s\\/\\sEpisodi|\\sEpisodio|\\s\\|\\sEpisodio|\\sEpisodi)\\s\\[", "E"]
- name: re_replace #/ Completa [episodi YY-YY --> EYY-YY
args: ["(\\s\\/\\sCompleta\\s\\[episodi\\s)", "E"]
- name: re_replace #remove di YY] | remove /YY]
args: ["(\\sdi\\s\\d{1,2}|\\/\\d{1,2})\\]", " "]
- name: re_replace #remove various
args: ["(Serie completa|Completa|\\[in pausa\\])", ""]
# end of test block
- name: re_replace # S01 E01 to S01E01
args: ["(?i)\\bS(\\d+)\\sE(\\d+)\\b", "S$1E$2"]
- name: re_replace # 01x01 to S01E01
args: ["(?i)(\\d{2})x(\\d+)", "S$1E$2"]
- name: re_replace # 1x01 to S01E01
args: ["(?i)\\b(\\d{1})x(\\d+)", "S0$1E$2"]
- name: re_replace # Stagione X --> S0X
args: ["(?i)\\bStagion[ei]\\s?(\\d{1})\\b|\\bSeason'?s?\\s?(\\d{1})\\b", "S0$1$2"]
- name: re_replace # Stagione XX --> SXX
args: ["(?i)\\bStagion[ei]\\s?(\\d{2,})\\b|\\bSeason'?s?\\s?(\\d{2,})\\b", "S$1$2"]
- name: re_replace # Episodio 4 to E4
args: ["(?i)\\b(?:[\\/\\|]?Episodio\\s?(\\d+)|Puntata\\s?(\\d+))", "E$1$2"]
- name: re_replace # Episodi 4 5 to E04-05
args: ["(?i)\\b(?:Puntate\\s*)(\\d+)\\s?(\\d+)", "E0$1-0$2"]
- name: re_replace # various removals
args: ["(?i)(Serie completa|Completat?a?|in pausa)", ""]
title: # long titles?
optional: true
selector: a[title][href^="index.php?page=torrent-details"]
attribute: title
filters:
- name: replace
args: ["Vedi Dettagli: ", ""]
# start of test block
- name: re_replace # replace special characters with " " (space)
args: ["[^a-zA-Z0-9]|\\.", " "]
args: ["[^a-zA-Z0-9\\s]|\\.", " "]
- name: re_replace # replace multiple spaces
args: ["[ ]{2,}", " "]
# normalize to SXXEYY format
- name: re_replace
args: ["(\\d{2})x(\\d{2})", "S$1E$2"]
- name: re_replace
args: ["(\\d{1})x(\\d{2})", "S0$1E$2"]
- name: re_replace #Stagione X --> S0X
args: ["Stagione (\\d{0,1}\\s)", "S0$1"]
- name: re_replace #Stagione XX --> SXX
args: ["Stagione (\\d{2}\\s)", "S$1"]
- name: re_replace #/ Episodio [YY-YY --> EYY-YY
args: ["(\\s\\/\\sEpisodio|\\s\\/\\sEpisodi|\\sEpisodio|\\s\\|\\sEpisodio|\\sEpisodi)\\s\\[", "E"]
- name: re_replace #/ Completa [episodi YY-YY --> EYY-YY
args: ["(\\s\\/\\sCompleta\\s\\[episodi\\s)", "E"]
- name: re_replace #remove di YY] | remove /YY]
args: ["(\\sdi\\s\\d{1,2}|\\/\\d{1,2})\\]", " "]
- name: re_replace #remove various
args: ["(Serie completa|Completa|\\[in pausa\\])", ""]
# end of test block
- name: re_replace # S01 E01 to S01E01
args: ["(?i)\\bS(\\d+)\\sE(\\d+)\\b", "S$1E$2"]
- name: re_replace # 01x01 to S01E01
args: ["(?i)(\\d{2})x(\\d+)", "S$1E$2"]
- name: re_replace # 1x01 to S01E01
args: ["(?i)\\b(\\d{1})x(\\d+)", "S0$1E$2"]
- name: re_replace # Stagione X --> S0X
args: ["(?i)\\bStagion[ei]\\s?(\\d{1})\\b|\\bSeason'?s?\\s?(\\d{1})\\b", "S0$1$2"]
- name: re_replace # Stagione XX --> SXX
args: ["(?i)\\bStagion[ei]\\s?(\\d{2,})\\b|\\bSeason'?s?\\s?(\\d{2,})\\b", "S$1$2"]
- name: re_replace # Episodio 4 to E4
args: ["(?i)\\b(?:[\\/\\|]?Episodio\\s?(\\d+)|Puntata\\s?(\\d+))", "E$1$2"]
- name: re_replace # Episodi 4 5 to E04-05
args: ["(?i)\\b(?:Puntate\\s*)(\\d+)\\s?(\\d+)", "E0$1-0$2"]
- name: re_replace # various removals
args: ["(?i)(Serie completa|Completat?a?|in pausa)", ""]
category:
selector: a[href^="index.php?page=torrents&category="]
attribute: href
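As a rough illustration of the title filter chain above, which rewrites Italian release titles into the SxxEyy form the downstream applications expect, here is a standalone .NET Regex sketch; the patterns are copied from the filters above, the sample title is hypothetical, and only the rules relevant to that title are shown.

using System;
using System.Text.RegularExpressions;

class TitleNormalizationSketch
{
    static string Normalize(string title)
    {
        // Patterns copied from the title filters above, applied in the same order.
        title = Regex.Replace(title, @"[^a-zA-Z0-9\s]|\.", " ");                                // special characters -> space
        title = Regex.Replace(title, @"[ ]{2,}", " ");                                          // collapse repeated spaces
        title = Regex.Replace(title, @"\bS(\d+)\sE(\d+)\b", "S$1E$2", RegexOptions.IgnoreCase); // S01 E01 -> S01E01
        title = Regex.Replace(title, @"(\d{2})x(\d+)", "S$1E$2", RegexOptions.IgnoreCase);      // 01x01   -> S01E01
        title = Regex.Replace(title, @"\b(\d{1})x(\d+)", "S0$1E$2", RegexOptions.IgnoreCase);   // 1x01    -> S01E01
        return title.Trim();                                                                    // trim only for display
    }

    static void Main()
    {
        // Hypothetical Italian release title in the common XXxYY form.
        Console.WriteLine(Normalize("Il Trono di Spade 1x05 [720p ITA]"));
        // prints: Il Trono di Spade S01E05 720p ITA
    }
}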

View File

@@ -6,7 +6,7 @@
type: public
encoding: UTF-8
links:
- https://www.torrent9.blue/
- https://ww2.torrent9.blue/
legacylinks:
- http://www.torrent9.ec/
- http://www.torrent9.red/
@@ -15,6 +15,7 @@
- http://www.torrent9.cc/
- http://www.torrent9.pe/
- http://www.torrent9.blue/
- https://www.torrent9.blue/
caps:
categorymappings:

View File

@@ -6,12 +6,14 @@
type: semi-private
encoding: UTF-8
links:
- https://ww2.yggtorrent.is/
- https://ww4.yggtorrent.is/
legacylinks:
- https://yggtorrent.is/
- https://yggtorrent.com/
- https://ww1.yggtorrent.com/
- https://ww1.yggtorrent.is/
- https://ww2.yggtorrent.is/
- https://ww3.yggtorrent.is/
caps:
categorymappings:

View File

@@ -1,267 +0,0 @@
using System;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Text;
using Autofac;
using AutoMapper;
using Jackett.Common.Models;
using Jackett.Common.Models.Config;
using Jackett.Common.Plumbing;
using Jackett.Common.Services;
using Jackett.Common.Services.Interfaces;
using Jackett.Common.Utils.Clients;
using NLog;
using NLog.Config;
using NLog.Targets;
namespace Jackett.Common
{
public class Engine
{
public static Type WebClientType;
private static IContainer container = null;
private static bool _automapperInitialised = false;
public static void BuildContainer(RuntimeSettings settings, params Autofac.Module[] ApplicationSpecificModules)
{
var builder = new ContainerBuilder();
SetupLogging(settings, builder);
if (_automapperInitialised == false)
{
//Automapper only likes being initialized once per app domain.
//Since we can restart Jackett from the command line it's possible that we'll build the container more than once. (tests do this too)
InitAutomapper();
_automapperInitialised = true;
}
builder.RegisterModule(new JackettModule(settings));
foreach (var module in ApplicationSpecificModules)
{
builder.RegisterModule(module);
}
container = builder.Build();
// create PID file early
if (!string.IsNullOrWhiteSpace(settings.PIDFile))
{
try
{
var proc = Process.GetCurrentProcess();
File.WriteAllText(settings.PIDFile, proc.Id.ToString());
}
catch (Exception e)
{
Logger.Error(e, "Error while creating the PID file");
}
}
}
private static void InitAutomapper()
{
Mapper.Initialize(cfg =>
{
cfg.CreateMap<WebClientByteResult, WebClientStringResult>().ForMember(x => x.Content, opt => opt.Ignore()).AfterMap((be, str) =>
{
var encoding = be.Request.Encoding ?? Encoding.UTF8;
str.Content = encoding.GetString(be.Content);
});
cfg.CreateMap<WebClientStringResult, WebClientByteResult>().ForMember(x => x.Content, opt => opt.Ignore()).AfterMap((str, be) =>
{
if (!string.IsNullOrEmpty(str.Content))
{
var encoding = str.Request.Encoding ?? Encoding.UTF8;
be.Content = encoding.GetBytes(str.Content);
}
});
cfg.CreateMap<WebClientStringResult, WebClientStringResult>();
cfg.CreateMap<WebClientByteResult, WebClientByteResult>();
cfg.CreateMap<ReleaseInfo, ReleaseInfo>();
cfg.CreateMap<ReleaseInfo, TrackerCacheResult>().AfterMap((r, t) =>
{
if (r.Category != null)
{
var CategoryDesc = string.Join(", ", r.Category.Select(x => TorznabCatType.GetCatDesc(x)).Where(x => !string.IsNullOrEmpty(x)));
t.CategoryDesc = CategoryDesc;
}
else
{
t.CategoryDesc = "";
}
});
});
}
public static IContainer GetContainer()
{
return container;
}
public static IConfigurationService ConfigService
{
get
{
return container.Resolve<IConfigurationService>();
}
}
public static IProcessService ProcessService
{
get
{
return container.Resolve<IProcessService>();
}
}
public static IServiceConfigService ServiceConfig
{
get
{
return container.Resolve<IServiceConfigService>();
}
}
public static ITrayLockService LockService
{
get
{
return container.Resolve<ITrayLockService>();
}
}
public static IServerService Server
{
get
{
return container.Resolve<IServerService>();
}
}
public static ServerConfig ServerConfig
{
get
{
return container.Resolve<ServerConfig>();
}
}
public static IRunTimeService RunTime
{
get
{
return container.Resolve<IRunTimeService>();
}
}
public static Logger Logger
{
get
{
return container.Resolve<Logger>();
}
}
public static ISecuityService SecurityService
{
get
{
return container.Resolve<ISecuityService>();
}
}
private static void SetupLogging(RuntimeSettings settings, ContainerBuilder builder)
{
var logFileName = settings.CustomLogFileName ?? "log.txt";
var logLevel = settings.TracingEnabled ? LogLevel.Debug : LogLevel.Info;
// Add custom date time format renderer as the default is too long
ConfigurationItemFactory.Default.LayoutRenderers.RegisterDefinition("simpledatetime", typeof(Utils.LoggingSetup.SimpleDateTimeRenderer));
var logConfig = new LoggingConfiguration();
var logFile = new FileTarget();
logConfig.AddTarget("file", logFile);
logFile.Layout = "${longdate} ${level} ${message} ${exception:format=ToString}";
logFile.FileName = Path.Combine(settings.DataFolder, logFileName);
logFile.ArchiveFileName = "log.{#####}.txt";
logFile.ArchiveAboveSize = 500000;
logFile.MaxArchiveFiles = 5;
logFile.KeepFileOpen = false;
logFile.ArchiveNumbering = ArchiveNumberingMode.DateAndSequence;
var logFileRule = new LoggingRule("*", logLevel, logFile);
logConfig.LoggingRules.Add(logFileRule);
var logConsole = new ColoredConsoleTarget();
logConfig.AddTarget("console", logConsole);
logConsole.Layout = "${simpledatetime} ${level} ${message} ${exception:format=ToString}";
var logConsoleRule = new LoggingRule("*", logLevel, logConsole);
logConfig.LoggingRules.Add(logConsoleRule);
var logService = new LogCacheService();
logConfig.AddTarget("service", logService);
var serviceRule = new LoggingRule("*", logLevel, logService);
logConfig.LoggingRules.Add(serviceRule);
LogManager.Configuration = logConfig;
if (builder != null)
{
builder.RegisterInstance(LogManager.GetCurrentClassLogger()).SingleInstance();
}
}
public static void SetLogLevel(LogLevel level)
{
foreach (var rule in LogManager.Configuration.LoggingRules)
{
if (level == LogLevel.Debug)
{
if (!rule.Levels.Contains(LogLevel.Debug))
{
rule.EnableLoggingForLevel(LogLevel.Debug);
}
}
else
{
if (rule.Levels.Contains(LogLevel.Debug))
{
rule.DisableLoggingForLevel(LogLevel.Debug);
}
}
}
LogManager.ReconfigExistingLoggers();
}
public static void Exit(int exitCode)
{
try
{
if (Engine.ServerConfig != null &&
Engine.ServerConfig.RuntimeSettings != null &&
!string.IsNullOrWhiteSpace(Engine.ServerConfig.RuntimeSettings.PIDFile))
{
var PIDFile = Engine.ServerConfig.RuntimeSettings.PIDFile;
if (File.Exists(PIDFile))
{
Engine.Logger.Info("Deleting PID file " + PIDFile);
File.Delete(PIDFile);
}
}
}
catch (Exception e)
{
Logger.Error(e, "Error while deleting the PID file");
}
Environment.Exit(exitCode);
}
public static void SaveServerConfig()
{
ConfigService.SaveConfig(ServerConfig);
}
}
}
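The removed Engine guarded Mapper.Initialize with a static flag because AutoMapper only tolerates one initialization per AppDomain, while the container can be rebuilt after command-line restarts and in tests. A minimal, self-contained sketch of that guard pattern, with a hypothetical ConfigureMappings standing in for the AutoMapper setup:

using System;

static class OneTimeInitSketch
{
    private static bool _initialised;

    public static void EnsureInitialised()
    {
        // The container may be built more than once (restarts, tests); the mapping setup must not be.
        if (_initialised)
            return;
        ConfigureMappings(); // stand-in for Mapper.Initialize(cfg => ...)
        _initialised = true;
    }

    private static void ConfigureMappings() => Console.WriteLine("mappings configured once");
}

class Program
{
    static void Main()
    {
        OneTimeInitSketch.EnsureInitialised(); // runs the configuration
        OneTimeInitSketch.EnsureInitialised(); // no-op on the second call
    }
}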

View File

@@ -0,0 +1,238 @@
using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using AngleSharp.Parser.Html;
using Jackett.Common.Models;
using Jackett.Common.Models.IndexerConfig;
using Jackett.Common.Services.Interfaces;
using Jackett.Common.Utils;
using Jackett.Common.Utils.Clients;
using Newtonsoft.Json.Linq;
using NLog;
namespace Jackett.Common.Indexers
{
public class Pier720 : BaseWebIndexer
{
private string LoginUrl { get { return SiteLink + "ucp.php?mode=login"; } }
private string SearchUrl { get { return SiteLink + "search.php"; } }
private new ConfigurationDataBasicLoginWithRSSAndDisplay configData
{
get { return (ConfigurationDataBasicLoginWithRSSAndDisplay)base.configData; }
set { base.configData = value; }
}
public Pier720(IIndexerConfigurationService configService, WebClient wc, Logger l, IProtectionService ps)
: base(name: "720pier",
description: "720pier is a RUSSIAN Private Torrent Tracker for HD SPORTS",
link: "http://720pier.ru/",
caps: TorznabUtil.CreateDefaultTorznabTVCaps(),
configService: configService,
client: wc,
logger: l,
p: ps,
configData: new ConfigurationDataBasicLoginWithRSSAndDisplay())
{
Encoding = Encoding.UTF8;
Language = "ru-ru";
Type = "private";
AddCategoryMapping(32, TorznabCatType.TVSport, "Basketball");
AddCategoryMapping(34, TorznabCatType.TVSport, "Basketball - NBA");
AddCategoryMapping(87, TorznabCatType.TVSport, "Basketball - NBA Playoffs");
AddCategoryMapping(81, TorznabCatType.TVSport, "Basketball - NBA Playoffs - 2016");
AddCategoryMapping(95, TorznabCatType.TVSport, "Basketball - NBA Playoffs - 2017");
AddCategoryMapping(58, TorznabCatType.TVSport, "Basketball - NBA (до 2015 г.)");
AddCategoryMapping(52, TorznabCatType.TVSport, "Basketball - NCAA");
AddCategoryMapping(82, TorznabCatType.TVSport, "Basketball - WNBA");
AddCategoryMapping(36, TorznabCatType.TVSport, "Basketball - European basketball");
AddCategoryMapping(37, TorznabCatType.TVSport, "Basketball - World Championship");
AddCategoryMapping(51, TorznabCatType.TVSport, "Basketball - Reviews and highlights");
AddCategoryMapping(41, TorznabCatType.TVSport, "Basketball - Other");
AddCategoryMapping(38, TorznabCatType.TVSport, "Basketball - Olympic Games");
AddCategoryMapping(42, TorznabCatType.TVSport, "Football");
AddCategoryMapping(43, TorznabCatType.TVSport, "Football - NFL");
AddCategoryMapping(66, TorznabCatType.TVSport, "Football - Super Bowls");
AddCategoryMapping(53, TorznabCatType.TVSport, "Football - NCAA");
AddCategoryMapping(99, TorznabCatType.TVSport, "Football - CFL");
AddCategoryMapping(54, TorznabCatType.TVSport, "Football - Reviews and highlights");
AddCategoryMapping(97, TorznabCatType.TVSport, "Football - Documentaries");
AddCategoryMapping(44, TorznabCatType.TVSport, "Football - Other");
AddCategoryMapping(46, TorznabCatType.TVSport, "Hockey");
AddCategoryMapping(48, TorznabCatType.TVSport, "Hockey - NHL");
AddCategoryMapping(88, TorznabCatType.TVSport, "Hockey - NHL Playoffs");
AddCategoryMapping(93, TorznabCatType.TVSport, "Hockey - NHL Playoffs - 2017");
AddCategoryMapping(80, TorznabCatType.TVSport, "Hockey - NHL Playoffs - 2016");
AddCategoryMapping(65, TorznabCatType.TVSport, "Hockey - Stanley Cup Finals");
AddCategoryMapping(69, TorznabCatType.TVSport, "Hockey - Stanley Cup Finals - 2005-2014");
AddCategoryMapping(70, TorznabCatType.TVSport, "Hockey - Stanley Cup Finals - 2003");
AddCategoryMapping(92, TorznabCatType.TVSport, "Hockey - NCAA");
AddCategoryMapping(49, TorznabCatType.TVSport, "Hockey - World Championship");
AddCategoryMapping(68, TorznabCatType.TVSport, "Hockey - Documentaries");
AddCategoryMapping(64, TorznabCatType.TVSport, "Hockey - Reviews and highlights");
AddCategoryMapping(50, TorznabCatType.TVSport, "Hockey - Other");
AddCategoryMapping(55, TorznabCatType.TVSport, "Baseball");
AddCategoryMapping(71, TorznabCatType.TVSport, "Baseball - MLB");
AddCategoryMapping(72, TorznabCatType.TVSport, "Baseball - Other");
AddCategoryMapping(85, TorznabCatType.TVSport, "Baseball - Reviews, highlights, documentaries");
AddCategoryMapping(59, TorznabCatType.TVSport, "Soccer");
AddCategoryMapping(61, TorznabCatType.TVSport, "Soccer - English soccer");
AddCategoryMapping(86, TorznabCatType.TVSport, "Soccer - UEFA");
AddCategoryMapping(100, TorznabCatType.TVSport, "Soccer - MLS");
AddCategoryMapping(62, TorznabCatType.TVSport, "Soccer - Other tournaments, championships");
AddCategoryMapping(63, TorznabCatType.TVSport, "Soccer - World Championships");
AddCategoryMapping(98, TorznabCatType.TVSport, "Soccer - FIFA World Cup");
AddCategoryMapping(45, TorznabCatType.TVSport, "Other sports");
AddCategoryMapping(79, TorznabCatType.TVSport, "Other sports - Rugby");
AddCategoryMapping(78, TorznabCatType.TVSport, "Other sports - Lacrosse");
AddCategoryMapping(77, TorznabCatType.TVSport, "Other sports - Cricket");
AddCategoryMapping(76, TorznabCatType.TVSport, "Other sports - Volleyball");
AddCategoryMapping(75, TorznabCatType.TVSport, "Other sports - Tennis");
AddCategoryMapping(74, TorznabCatType.TVSport, "Other sports - Fighting");
AddCategoryMapping(73, TorznabCatType.TVSport, "Other sports - Auto, moto racing");
AddCategoryMapping(91, TorznabCatType.TVSport, "Other sports - Olympic Games");
AddCategoryMapping(94, TorznabCatType.TVSport, "Other sports - Misc");
AddCategoryMapping(56, TorznabCatType.TVSport, "Sports on tv");
}
public override async Task<IndexerConfigurationStatus> ApplyConfiguration(JToken configJson)
{
LoadValuesFromJson(configJson);
var pairs = new Dictionary<string, string>
{
{ "username", configData.Username.Value },
{ "password", configData.Password.Value },
{ "redirect", "/" },
{ "login", "Login" }
};
var result = await RequestLoginAndFollowRedirect(LoginUrl, pairs, null, true, null, LoginUrl, true);
await ConfigureIfOK(result.Cookies, result.Content != null && result.Content.Contains("ucp.php?mode=logout&"), () =>
{
var errorMessage = result.Content;
throw new ExceptionWithConfigData(errorMessage, configData);
});
return IndexerConfigurationStatus.RequiresTesting;
}
protected override async Task<IEnumerable<ReleaseInfo>> PerformQuery(TorznabQuery query)
{
var releases = new List<ReleaseInfo>();
var searchString = query.GetQueryString();
WebClientStringResult results = null;
var queryCollection = new NameValueCollection();
queryCollection.Add("st", "0");
queryCollection.Add("sd", "d");
queryCollection.Add("sk", "t");
queryCollection.Add("tracker_search", "torrent");
queryCollection.Add("t", "0");
queryCollection.Add("submit", "Search");
queryCollection.Add("sr", "topics");
//queryCollection.Add("sr", "posts");
//queryCollection.Add("ch", "99999");
// if the search string is empty, use the active topics view
if (string.IsNullOrWhiteSpace(searchString))
{
queryCollection.Add("search_id", "active_topics");
queryCollection.Add("ot", "1");
}
else // use the normal search
{
searchString = searchString.Replace("-", " ");
queryCollection.Add("keywords", searchString);
queryCollection.Add("sf", "titleonly");
queryCollection.Add("sr", "topics");
queryCollection.Add("pt", "t");
queryCollection.Add("ot", "1");
}
var searchUrl = SearchUrl + "?" + queryCollection.GetQueryString();
results = await RequestStringWithCookies(searchUrl);
if (!results.Content.Contains("ucp.php?mode=logout"))
{
await ApplyConfiguration(null);
results = await RequestStringWithCookies(searchUrl);
}
try
{
string RowsSelector = "ul.topics > li.row";
var ResultParser = new HtmlParser();
var SearchResultDocument = ResultParser.Parse(results.Content);
var Rows = SearchResultDocument.QuerySelectorAll(RowsSelector);
foreach (var Row in Rows)
{
try
{
var release = new ReleaseInfo();
release.MinimumRatio = 1;
release.MinimumSeedTime = 0;
var qDetailsLink = Row.QuerySelector("a.topictitle");
release.Title = qDetailsLink.TextContent;
release.Comments = new Uri(SiteLink + qDetailsLink.GetAttribute("href"));
release.Guid = release.Comments;
var detailsResult = await RequestStringWithCookies(SiteLink + qDetailsLink.GetAttribute("href"));
var DetailsResultDocument = ResultParser.Parse(detailsResult.Content);
var qDownloadLink = DetailsResultDocument.QuerySelector("table.table2 > tbody > tr > td > a[href^=\"/download/torrent.php?id\"]");
release.Link = new Uri(SiteLink + qDownloadLink.GetAttribute("href"));
release.Seeders = ParseUtil.CoerceInt(Row.QuerySelector("span.seed").TextContent);
release.Peers = ParseUtil.CoerceInt(Row.QuerySelector("span.leech").TextContent) + release.Seeders;
release.Grabs = ParseUtil.CoerceLong(Row.QuerySelector("span.complet").TextContent);
var author = Row.QuerySelector("dd.lastpost > span");
var timestr = author.TextContent.Split('\n')[4].Trim();
release.PublishDate = DateTimeUtil.FromUnknown(timestr, "UK");
var forum = Row.QuerySelector("a[href^=\"./viewforum.php?f=\"]");
var forumid = forum.GetAttribute("href").Split('=')[1];
release.Category = MapTrackerCatToNewznab(forumid);
var size = Row.QuerySelector("dl.row-item > dt > div.list-inner > div[style^=\"float:right\"]").TextContent;
size = size.Replace("GiB", "GB");
size = size.Replace("MiB", "MB");
size = size.Replace("KiB", "KB");
release.Size = ReleaseInfo.GetBytes(size);
release.DownloadVolumeFactor = 1;
release.UploadVolumeFactor = 1;
releases.Add(release);
}
catch (Exception ex)
{
logger.Error(string.Format("{0}: Error while parsing row '{1}':\n\n{2}", ID, Row.OuterHtml, ex));
}
}
}
catch (Exception ex)
{
OnParseError(results.Content, ex);
}
return releases;
}
}
}
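A rough, standalone sketch of how PerformQuery above assembles the phpBB search URL, including the fallback to the active-topics view when no keywords are given. The real indexer builds this with a NameValueCollection and Jackett's GetQueryString() helper; the sketch below uses only standard .NET types, and the site link and search term are hypothetical.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;

class Pier720QuerySketch
{
    static string BuildSearchUrl(string siteLink, string searchString)
    {
        // Mirrors the parameter set used in PerformQuery above (phpBB search.php parameters).
        var parameters = new List<KeyValuePair<string, string>>
        {
            new KeyValuePair<string, string>("st", "0"),
            new KeyValuePair<string, string>("sd", "d"),
            new KeyValuePair<string, string>("sk", "t"),
            new KeyValuePair<string, string>("tracker_search", "torrent"),
            new KeyValuePair<string, string>("t", "0"),
            new KeyValuePair<string, string>("submit", "Search"),
            new KeyValuePair<string, string>("sr", "topics")
        };

        if (string.IsNullOrWhiteSpace(searchString))
        {
            // No keywords: fall back to the active topics view.
            parameters.Add(new KeyValuePair<string, string>("search_id", "active_topics"));
            parameters.Add(new KeyValuePair<string, string>("ot", "1"));
        }
        else
        {
            // Keywords: title-only search; hyphens are replaced with spaces, as in PerformQuery above.
            parameters.Add(new KeyValuePair<string, string>("keywords", searchString.Replace("-", " ")));
            parameters.Add(new KeyValuePair<string, string>("sf", "titleonly"));
            parameters.Add(new KeyValuePair<string, string>("pt", "t"));
            parameters.Add(new KeyValuePair<string, string>("ot", "1"));
        }

        var query = string.Join("&", parameters.Select(p => p.Key + "=" + WebUtility.UrlEncode(p.Value)));
        return siteLink + "search.php?" + query;
    }

    static void Main()
    {
        Console.WriteLine(BuildSearchUrl("http://720pier.ru/", "NHL 2018"));
    }
}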

View File

@@ -177,13 +177,6 @@ namespace Jackett.Common.Indexers
//TODO: Remove this section once users have moved off DPAPI
private bool MigratedFromDPAPI(JToken jsonConfig)
{
if (EnvironmentUtil.IsRunningLegacyOwin)
{
//Still running legacy Owin and using the DPAPI, we don't want to migrate
logger.Debug(ID + " - Running Owin, no need to migrate from DPAPI");
return false;
}
bool runningOnDotNetCore = RuntimeInformation.FrameworkDescription.IndexOf("core", StringComparison.OrdinalIgnoreCase) >= 0;
bool isWindows = Environment.OSVersion.Platform == PlatformID.Win32NT;
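For reference, a tiny self-contained sketch of the runtime check this migration path relies on: DPAPI-protected values are only readable by the full .NET Framework on Windows, so the two flags above are combined before attempting a migration. The helper name is hypothetical, and the sketch only mirrors the detection, not the migration itself.

using System;
using System.Runtime.InteropServices;

class DpapiMigrationCheckSketch
{
    // Hypothetical helper: DPAPI-protected config values can only be read by the
    // full .NET Framework on Windows, so the migration is skipped everywhere else.
    static bool CanReadDpapiProtectedConfig()
    {
        bool runningOnDotNetCore = RuntimeInformation.FrameworkDescription
            .IndexOf("core", StringComparison.OrdinalIgnoreCase) >= 0;
        bool isWindows = Environment.OSVersion.Platform == PlatformID.Win32NT;
        return isWindows && !runningOnDotNetCore;
    }

    static void Main()
    {
        Console.WriteLine("Can read DPAPI-protected config: " + CanReadDpapiProtectedConfig());
    }
}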

View File

@@ -0,0 +1,918 @@
using System;
using System.Collections.Generic;
using System.Collections.Specialized;
using System.Globalization;
using System.IO;
using System.Linq;
using System.Reflection;
using System.Text;
using System.Text.RegularExpressions;
using System.Threading.Tasks;
using CsQuery;
using Jackett.Common.Helpers;
using Jackett.Common.Models;
using Jackett.Common.Models.IndexerConfig.Bespoke;
using Jackett.Common.Services.Interfaces;
using Jackett.Common.Utils;
using Jackett.Common.Utils.Clients;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using NLog;
namespace Jackett.Common.Indexers
{
/// <summary>
/// Provider for Nordicbits Private Tracker
/// </summary>
public class Nordicbits : BaseCachingWebIndexer
{
private string LoginUrl => SiteLink + "login.php";
private string LoginCheckUrl => SiteLink + "takelogin.php";
private string SearchUrl => SiteLink + "browse.php";
private string TorrentCommentUrl => SiteLink + "details.php?id={id}&comonly=1#page1";
private string TorrentDescriptionUrl => SiteLink + "details.php?id={id}";
private string TorrentDownloadUrl => SiteLink + "download.php?torrent={id}&torrent_pass={passkey}";
private bool Latency => ConfigData.Latency.Value;
private bool DevMode => ConfigData.DevMode.Value;
private bool CacheMode => ConfigData.HardDriveCache.Value;
private static string Directory => Path.Combine(Path.GetTempPath(), "Jackett", MethodBase.GetCurrentMethod().DeclaringType?.Name);
private readonly Dictionary<string, string> _emulatedBrowserHeaders = new Dictionary<string, string>();
private CQ _fDom;
private ConfigurationDataNordicbits ConfigData => (ConfigurationDataNordicbits)configData;
public Nordicbits(IIndexerConfigurationService configService, Utils.Clients.WebClient w, Logger l, IProtectionService ps)
: base(
name: "Nordicbits",
description: "Nordicbits is a Danish Private site for MOVIES / TV / GENERAL",
link: "https://nordicb.org/",
caps: new TorznabCapabilities(),
configService: configService,
client: w,
logger: l,
p: ps,
configData: new ConfigurationDataNordicbits())
{
Encoding = Encoding.GetEncoding("iso-8859-1");
Language = "da-dk";
Type = "private";
TorznabCaps.SupportsImdbSearch = false;
// Apps
AddCategoryMapping("cat=63", TorznabCatType.PCPhoneAndroid, "APPS - Android");
AddCategoryMapping("cat=17", TorznabCatType.PC, "APPS - MAC");
AddCategoryMapping("cat=12", TorznabCatType.PCMac, "APPS - Windows");
AddCategoryMapping("cat=62", TorznabCatType.PCPhoneIOS, "APPS - IOS");
AddCategoryMapping("cat=64", TorznabCatType.PC, "APPS - Linux");
// Books
AddCategoryMapping("cat=54", TorznabCatType.AudioAudiobook, "Books - Audiobooks");
AddCategoryMapping("cat=9", TorznabCatType.BooksEbook, "Books - E-Books");
// Games
AddCategoryMapping("cat=24", TorznabCatType.PCGames, "Games - PC");
AddCategoryMapping("cat=53", TorznabCatType.Console, "Games - Nintendo");
AddCategoryMapping("cat=49", TorznabCatType.ConsolePS4, "Games - Playstation");
AddCategoryMapping("cat=51", TorznabCatType.ConsoleXbox, "Games - XBOX");
// Movies
AddCategoryMapping("cat=35", TorznabCatType.Movies3D, "Movies - 3D");
AddCategoryMapping("cat=42", TorznabCatType.MoviesUHD, "Movies - 4K/2160p");
AddCategoryMapping("cat=47", TorznabCatType.MoviesUHD, "Movies - 4k/2160p Boxset");
AddCategoryMapping("cat=15", TorznabCatType.MoviesBluRay, "Movies - BluRay");
AddCategoryMapping("cat=58", TorznabCatType.MoviesHD, "Movies - Remux");
AddCategoryMapping("cat=16", TorznabCatType.MoviesDVD, "Movies - DVD Boxset");
AddCategoryMapping("cat=6", TorznabCatType.MoviesDVD, "Movies - DVD");
AddCategoryMapping("cat=21", TorznabCatType.MoviesHD, "Movies - HD-1080p");
AddCategoryMapping("cat=19", TorznabCatType.MoviesHD, "Movies - HD-1080p Boxset");
AddCategoryMapping("cat=22", TorznabCatType.MoviesHD, "Movies - HD-720p");
AddCategoryMapping("cat=20", TorznabCatType.MoviesHD, "Movies - HD-720p Boxset");
AddCategoryMapping("cat=25", TorznabCatType.MoviesHD, "Movies - Kids");
AddCategoryMapping("cat=10", TorznabCatType.MoviesSD, "Movies - SD");
AddCategoryMapping("cat=23", TorznabCatType.MoviesSD, "Movies - MP4 Tablet");
AddCategoryMapping("cat=65", TorznabCatType.XXX, "Movies - Porn");
// Music
AddCategoryMapping("cat=28", TorznabCatType.AudioLossless, "Music - FLAC");
AddCategoryMapping("cat=60", TorznabCatType.AudioLossless, "Music - FLAC Boxset");
AddCategoryMapping("cat=4", TorznabCatType.AudioMP3, "Music - MP3");
AddCategoryMapping("cat=59", TorznabCatType.AudioMP3, "Music - MP3 Boxset");
AddCategoryMapping("cat=61", TorznabCatType.AudioMP3, "Music - Musicvideos Boxset");
AddCategoryMapping("cat=1", TorznabCatType.AudioMP3, "Music - Musicvideos");
// Series
AddCategoryMapping("cat=48", TorznabCatType.TVUHD, "TV - HD-4K/2160p");
AddCategoryMapping("cat=57", TorznabCatType.TVUHD, "TV - HD-4K/2160p Boxset");
AddCategoryMapping("cat=11", TorznabCatType.TVSD, "TV - Boxset");
AddCategoryMapping("cat=7", TorznabCatType.TVHD, "TV - HD-1080p");
AddCategoryMapping("cat=31", TorznabCatType.TVHD, "TV - HD-1080p Boxset");
AddCategoryMapping("cat=30", TorznabCatType.TVHD, "TV - HD-720p");
AddCategoryMapping("cat=32", TorznabCatType.TVHD, "TV - HD-720p Boxset");
AddCategoryMapping("cat=5", TorznabCatType.TVSD, "TV - SD");
AddCategoryMapping("cat=66", TorznabCatType.TVSport, "TV - SD/HD Mixed");
}
/// <summary>
/// Configure our Nordicbits provider
/// </summary>
/// <param name="configJson">Our params in Json</param>
/// <returns>Configuration state</returns>
public override async Task<IndexerConfigurationStatus> ApplyConfiguration(JToken configJson)
{
// Retrieve config values set by Jackett's user
LoadValuesFromJson(configJson);
// Check & Validate Config
ValidateConfig();
// Setting our data for a better emulated browser (maximum security)
// TODO: Encoded Content not supported by Jackett at this time
// emulatedBrowserHeaders.Add("Accept-Encoding", "gzip, deflate");
// If we want to simulate a browser
if (ConfigData.Browser.Value)
{
// Clean headers
_emulatedBrowserHeaders.Clear();
// Inject headers
_emulatedBrowserHeaders.Add("Accept", ConfigData.HeaderAccept.Value);
_emulatedBrowserHeaders.Add("Accept-Language", ConfigData.HeaderAcceptLang.Value);
_emulatedBrowserHeaders.Add("DNT", Convert.ToInt32(ConfigData.HeaderDnt.Value).ToString());
_emulatedBrowserHeaders.Add("Upgrade-Insecure-Requests", Convert.ToInt32(ConfigData.HeaderUpgradeInsecure.Value).ToString());
_emulatedBrowserHeaders.Add("User-Agent", ConfigData.HeaderUserAgent.Value);
_emulatedBrowserHeaders.Add("Referer", LoginUrl);
}
await DoLogin();
return IndexerConfigurationStatus.RequiresTesting;
}
/// <summary>
/// Perform login to the tracker
/// </summary>
/// <returns></returns>
private async Task DoLogin()
{
// Build WebRequest for index
var myIndexRequest = new Utils.Clients.WebRequest()
{
Type = RequestType.GET,
Url = SiteLink,
Headers = _emulatedBrowserHeaders,
Encoding = Encoding
};
// Get index page for cookies
Output("\nGetting index page (for cookies).. with " + SiteLink);
var indexPage = await webclient.GetString(myIndexRequest);
// Building login form data
var pairs = new Dictionary<string, string> {
{ "username", ConfigData.Username.Value },
{ "password", ConfigData.Password.Value }
};
// Build WebRequest for login
var myRequestLogin = new Utils.Clients.WebRequest()
{
Type = RequestType.GET,
Url = LoginUrl,
Headers = _emulatedBrowserHeaders,
Cookies = indexPage.Cookies,
Referer = SiteLink,
Encoding = Encoding
};
// Get login page -- (not used, but the request is needed to satisfy the tracker's security checks)
LatencyNow();
Output("\nGetting login page (user simulation).. with " + LoginUrl);
await webclient.GetString(myRequestLogin);
// Build WebRequest for submitting authentication
var request = new Utils.Clients.WebRequest()
{
PostData = pairs,
Referer = LoginUrl,
Type = RequestType.POST,
Url = LoginCheckUrl,
Headers = _emulatedBrowserHeaders,
Cookies = indexPage.Cookies,
Encoding = Encoding
};
// Perform login
LatencyNow();
Output("\nPerforming login.. with " + LoginCheckUrl);
var response = await webclient.GetString(request);
// Test if we are logged in
await ConfigureIfOK(response.Cookies, response.Cookies != null && response.Cookies.Contains("uid="), () =>
{
// Default error message
var message = "Error during attempt !";
// Parse redirect header
var redirectTo = response.RedirectingTo;
// Oops, unable to login
Output("-> Login failed: " + message, "error");
throw new ExceptionWithConfigData("Login failed: " + message, configData);
});
Output("\nCookies saved for future uses...");
ConfigData.CookieHeader.Value = indexPage.Cookies + " " + response.Cookies + " ts_username=" + ConfigData.Username.Value;
Output("\n-> Login Success\n");
}
/// <summary>
/// Check logged-in state for provider
/// </summary>
/// <returns></returns>
private async Task CheckLogin()
{
// Checking ...
Output("\n-> Checking logged-in state....");
var loggedInCheck = await RequestStringWithCookies(SearchUrl);
if (!loggedInCheck.Content.Contains("logout.php"))
{
// Cookie expired, renew session on provider
Output("-> Not logged, login now...\n");
await DoLogin();
}
else
{
// Already logged, session active
Output("-> Already logged, continue...\n");
}
}
/// <summary>
/// Execute our search query
/// </summary>
/// <param name="query">Query</param>
/// <returns>Releases</returns>
protected override async Task<IEnumerable<ReleaseInfo>> PerformQuery(TorznabQuery query)
{
var releases = new List<ReleaseInfo>();
var torrentRowList = new List<CQ>();
var exactSearchTerm = query.GetQueryString();
var searchUrl = SearchUrl;
// Check login before performing a query
await CheckLogin();
// Check the cache first so we don't query the server (only when a search term is provided and dev mode is disabled)
if (!DevMode && !string.IsNullOrEmpty(exactSearchTerm))
{
lock (cache)
{
// Remove old cache items
CleanCache();
// Search in cache
var cachedResult = cache.FirstOrDefault(i => i.Query == exactSearchTerm);
if (cachedResult != null)
return cachedResult.Results.Select(s => (ReleaseInfo)s.Clone()).ToArray();
}
}
var SearchTerms = new List<string> { exactSearchTerm };
// duplicate search without diacritics
var baseSearchTerm = StringUtil.RemoveDiacritics(exactSearchTerm);
if (baseSearchTerm != exactSearchTerm)
SearchTerms.Add(baseSearchTerm);
foreach (var searchTerm in SearchTerms)
{
// Build our query
var request = BuildQuery(searchTerm, query, searchUrl);
// Getting results & Store content
var response = await RequestStringWithCookiesAndRetry(request, ConfigData.CookieHeader.Value);
_fDom = response.Content;
try
{
var firstPageRows = FindTorrentRows();
// Add them to torrents list
torrentRowList.AddRange(firstPageRows.Select(fRow => fRow.Cq()));
// If pagination available
int nbResults;
int pageLinkCount;
nbResults = 1;
pageLinkCount = 1;
// Check if we have a minimum of one result
if (firstPageRows.Length > 1)
{
// Retrieve the total count from our single page
nbResults = firstPageRows.Count();
}
else
{
// Check if no result
if (torrentRowList.Count == 0)
{
// No results found
Output("\nNo result found for your query, please try another search term ...\n", "info");
// No result found for this query
break;
}
}
Output("\nFound " + nbResults + " result(s) (+/- " + firstPageRows.Length + ") in " + pageLinkCount + " page(s) for this query !");
Output("\nThere are " + (firstPageRows.Length -2 ) + " results on the first page !");
// Loop on results
foreach (var tRow in torrentRowList.Skip(1).Take(torrentRowList.Count-2))
{
Output("Torrent #" + (releases.Count + 1));
// ID
var idOrig = tRow.Find("td:eq(1) > a:eq(0)").Attr("href").Split('=')[1];
var id = idOrig.Substring(0, idOrig.Length - 4);
Output("ID: " + id);
// Release Name
var name = tRow.Find("td:eq(1) > a:eq(0)").Text();
// Category
string categoryID = tRow.Find("td:eq(0) > a:eq(0)").Attr("href").Split('?').Last();
var newznab = MapTrackerCatToNewznab(categoryID);
Output("Category: " + MapTrackerCatToNewznab(categoryID).First().ToString() + " (" + categoryID + ")");
// Seeders
int seeders = ParseUtil.CoerceInt(Regex.Match(tRow.Find("td:eq(9)").Text(), @"\d+").Value);
Output("Seeders: " + seeders);
// Leechers
int leechers = ParseUtil.CoerceInt(Regex.Match(tRow.Find("td:eq(10)").Text(), @"\d+").Value);
Output("Leechers: " + leechers);
// Files
int files = 1;
files = ParseUtil.CoerceInt(Regex.Match(tRow.Find("td:eq(4)").Text(), @"\d+").Value);
Output("Files: " + files);
// Completed
int completed = ParseUtil.CoerceInt(Regex.Match(tRow.Find("td:eq(8)").Text(), @"\d+").Value);
Output("Completed: " + completed);
// Size
var humanSize = tRow.Find("td:eq(7)").Text().ToLowerInvariant();
var size = ReleaseInfo.GetBytes(humanSize);
Output("Size: " + humanSize + " (" + size + " bytes)");
// Publish date
var dateTimeOrig = tRow.Find("td:eq(6)").Text();
var datestr = Regex.Replace(dateTimeOrig, @"<[^>]+>|&nbsp;", "").Trim();
datestr = Regex.Replace(datestr, "Today", DateTime.Now.ToString("MMM dd yyyy"), RegexOptions.IgnoreCase);
datestr = Regex.Replace(datestr, "Yesterday", DateTime.Now.Date.AddDays(-1).ToString("MMM dd yyyy"), RegexOptions.IgnoreCase);
DateTime date = DateTimeUtil.FromUnknown(datestr, "DK");
Output("Released on: " + date);
// Torrent Details URL
var detailsLink = new Uri(TorrentDescriptionUrl.Replace("{id}", id.ToString()));
Output("Details: " + detailsLink.AbsoluteUri);
// Torrent Comments URL
var commentsLink = new Uri(TorrentCommentUrl.Replace("{id}", id.ToString()));
Output("Comments Link: " + commentsLink.AbsoluteUri);
// Torrent Download URL
var passkey = tRow.Find("td:eq(2) > a:eq(0)").Attr("href");
var key = Regex.Match(passkey, "(?<=torrent_pass\\=)([a-zA-Z0-9]*)");
Uri downloadLink = new Uri(TorrentDownloadUrl.Replace("{id}", id.ToString()).Replace("{passkey}", key.ToString()));
Output("Download Link: " + downloadLink.AbsoluteUri);
// Building release infos
var release = new ReleaseInfo
{
Category = MapTrackerCatToNewznab(categoryID.ToString()),
Title = name,
Seeders = seeders,
Peers = seeders + leechers,
MinimumRatio = 1,
MinimumSeedTime = 172800,
PublishDate = date,
Size = size,
Files = files,
Grabs = completed,
Guid = detailsLink,
Comments = commentsLink,
Link = downloadLink
};
// IMDB
var imdbLink = tRow.Find("a[href*=\"http://imdb.com/title/\"]").First().Attr("href");
release.Imdb = ParseUtil.GetLongFromString(imdbLink);
if (tRow.Find("img[title=\"Free Torrent\"]").Length >= 1)
release.DownloadVolumeFactor = 0;
else if (tRow.Find("img[title=\"Halfleech\"]").Length >= 1)
release.DownloadVolumeFactor = 0.5;
else if (tRow.Find("img[title=\"90% Freeleech\"]").Length >= 1)
release.DownloadVolumeFactor = 0.1;
else
release.DownloadVolumeFactor = 1;
release.UploadVolumeFactor = 1;
releases.Add(release);
}
}
catch (Exception ex)
{
OnParseError("Error, unable to parse result \n" + ex.StackTrace, ex);
}
}
// Return found releases
return releases;
}
/// <summary>
/// Build query to process
/// </summary>
/// <param name="term">Term to search</param>
/// <param name="query">Torznab Query for categories mapping</param>
/// <param name="url">Search url for provider</param>
/// <param name="page">Page number to request</param>
/// <returns>URL to query for parsing and processing results</returns>
private string BuildQuery(string term, TorznabQuery query, string url, int page = 0)
{
var parameters = new NameValueCollection();
var categoriesList = MapTorznabCapsToTrackers(query);
string searchterm = term;
// Building our tracker query
parameters.Add("searchin", "title");
parameters.Add("incldead", "0");
// If search term provided
if (!string.IsNullOrWhiteSpace(query.ImdbID))
{
searchterm = "imdbsearch=" + query.ImdbID;
}
else if (!string.IsNullOrWhiteSpace(term))
{
searchterm = "search=" + WebUtilityHelpers.UrlEncode(term, Encoding.GetEncoding(28591));
}
else
{
// Showing all torrents (just for output function)
searchterm = "search=";
term = "all";
}
// Loop over the categories and rewrite them for search purposes
for (int i = 0; i < categoriesList.Count; i++)
{
// APPS
if (new[] { "63", "17", "12", "62", "64" }.Any(c => categoriesList[i].Contains(categoriesList[i])))
{
categoriesList[i] = categoriesList[i].Replace("cat=", "cats5[]=");
}
// Books
if (new[] { "54", "9" }.Any(c => categoriesList[i].Contains(categoriesList[i])))
{
categoriesList[i] = categoriesList[i].Replace("cat=", "cats6[]=");
}
// Games
if (new[] { "24", "53", "49", "51" }.Any(c => categoriesList[i].Contains(categoriesList[i])))
{
categoriesList[i] = categoriesList[i].Replace("cat=", "cats3[]=");
}
// Movies
if (new[] { "35", "42", "47", "15", "58", "16", "6", "21", "19", "22", "20", "25", "10", "23", "65" }.Any(c => categoriesList[i].Contains(categoriesList[i])))
{
categoriesList[i] = categoriesList[i].Replace("cat=", "cats1[]=");
}
// Music
if (new[] { "28", "60", "4", "59", "61", "1" }.Any(c => categoriesList[i].Contains(categoriesList[i])))
{
categoriesList[i] = categoriesList[i].Replace("cat=", "cats4[]=");
}
// Series
if (new[] { "48", "57", "11", "7", "31", "30", "32", "5", "66" }.Any(c => categoriesList[i].Contains(categoriesList[i])))
{
categoriesList[i] = categoriesList[i].Replace("cat=", "cats2[]=");
}
}
// Build category search string
var CatQryStr = "";
foreach (var cat in categoriesList)
CatQryStr += cat + "&";
// Building our query
url += "?" + CatQryStr + searchterm + "&" + parameters.GetQueryString();
Output("\nBuilded query for \"" + term + "\"... " + url);
// Return our search url
return url;
}
/// <summary>
/// Switch Method for Querying
/// </summary>
/// <param name="request">URL created by Query Builder</param>
/// <returns>Results from query</returns>
private async Task<WebClientStringResult> QueryExec(string request)
{
WebClientStringResult results;
// Switch in we are in DEV mode with Hard Drive Cache or not
if (DevMode && CacheMode)
{
// Check Cache before querying and load previous results if available
results = await QueryCache(request);
}
else
{
// Querying tracker directly
results = await QueryTracker(request);
}
return results;
}
/// <summary>
/// Get Torrents Page from Cache by Query Provided
/// </summary>
/// <param name="request">URL created by Query Builder</param>
/// <returns>Results from query</returns>
private async Task<WebClientStringResult> QueryCache(string request)
{
WebClientStringResult results;
// Create Directory if not exist
System.IO.Directory.CreateDirectory(Directory);
// Clean Storage Provider Directory from outdated cached queries
CleanCacheStorage();
// Create fingerprint for request
var file = Directory + request.GetHashCode() + ".json";
// Checking modes states
if (System.IO.File.Exists(file))
{
// File exist... loading it right now !
Output("Loading results from hard drive cache ..." + request.GetHashCode() + ".json");
results = JsonConvert.DeserializeObject<WebClientStringResult>(System.IO.File.ReadAllText(file));
}
else
{
// No cached file found, querying tracker directly
results = await QueryTracker(request);
// Cached file didn't exist for our query, writing it right now !
Output("Writing results to hard drive cache ..." + request.GetHashCode() + ".json");
System.IO.File.WriteAllText(file, JsonConvert.SerializeObject(results));
}
return results;
}
/// <summary>
/// Get Torrents Page from Tracker by Query Provided
/// </summary>
/// <param name="request">URL created by Query Builder</param>
/// <returns>Results from query</returns>
private async Task<WebClientStringResult> QueryTracker(string request)
{
// Cache mode not enabled or cached file didn't exist for our query
Output("\nQuerying tracker for results....");
// Request our first page
LatencyNow();
var results = await RequestStringWithCookiesAndRetry(request, ConfigData.CookieHeader.Value, SearchUrl, _emulatedBrowserHeaders);
// Return results from tracker
return results;
}
/// <summary>
/// Clean Hard Drive Cache Storage
/// </summary>
/// <param name="force">Force Provider Folder deletion</param>
private void CleanCacheStorage(bool force = false)
{
// Check cleaning method
if (force)
{
// Deleting Provider Storage folder and all files recursively
Output("\nDeleting Provider Storage folder and all files recursively ...");
// Check if directory exist
if (System.IO.Directory.Exists(Directory))
{
// Delete storage directory of provider
System.IO.Directory.Delete(Directory, true);
Output("-> Storage folder deleted successfully.");
}
else
{
// No directory, so nothing to do
Output("-> No Storage folder found for this provider !");
}
}
else
{
var i = 0;
// Check if there is file older than ... and delete them
Output("\nCleaning Provider Storage folder... in progress.");
System.IO.Directory.GetFiles(Directory)
.Select(f => new System.IO.FileInfo(f))
.Where(f => f.LastAccessTime < DateTime.Now.AddMilliseconds(-Convert.ToInt32(ConfigData.HardDriveCacheKeepTime.Value)))
.ToList()
.ForEach(f =>
{
Output("Deleting cached file << " + f.Name + " >> ... done.");
f.Delete();
i++;
});
// Inform on what was cleaned during process
if (i > 0)
{
Output("-> Deleted " + i + " cached files during cleaning.");
}
else
{
Output("-> Nothing deleted during cleaning.");
}
}
}
/// <summary>
/// Generate a random fake latency to avoid detection on tracker side
/// </summary>
private void LatencyNow()
{
// Need latency ?
if (Latency)
{
var random = new Random(DateTime.Now.Millisecond);
var waiting = random.Next(Convert.ToInt32(ConfigData.LatencyStart.Value),
Convert.ToInt32(ConfigData.LatencyEnd.Value));
Output("\nLatency Faker => Sleeping for " + waiting + " ms...");
// Sleep now...
System.Threading.Thread.Sleep(waiting);
}
// Generate a random value in our range
}
/// <summary>
/// Find torrent rows in search pages
/// </summary>
/// <returns>JQuery Object</returns>
private CQ FindTorrentRows()
{
// Return all occurrences of torrents found
//return _fDom["#content > table > tr"];
return _fDom["# base_content > table.mainouter > tbody > tr > td.outer > div.article > table > tbody > tr:not(:first)"];
}
/// <summary>
/// Download torrent file from tracker
/// </summary>
/// <param name="link">URL string</param>
/// <returns></returns>
public override async Task<byte[]> Download(Uri link)
{
// Retrieving ID from link provided
var id = ParseUtil.CoerceInt(Regex.Match(link.AbsoluteUri, @"\d+").Value);
Output("Torrent Requested ID: " + id);
// Building login form data
var pairs = new Dictionary<string, string> {
{ "torrentid", id.ToString() },
{ "_", string.Empty } // ~~ Strange, blank param...
};
// Add emulated XHR request
_emulatedBrowserHeaders.Add("X-Prototype-Version", "1.6.0.3");
_emulatedBrowserHeaders.Add("X-Requested-With", "XMLHttpRequest");
// Get torrent file now
Output("Getting torrent file now....");
var response = await base.Download(link);
// Remove our XHR request header
_emulatedBrowserHeaders.Remove("X-Prototype-Version");
_emulatedBrowserHeaders.Remove("X-Requested-With");
// Return content
return response;
}
/// <summary>
/// Output message for logging or development (console)
/// </summary>
/// <param name="message">Message to output</param>
/// <param name="level">Level for Logger</param>
private void Output(string message, string level = "debug")
{
// Check if we are in dev mode
if (DevMode)
{
// Output message to console
Console.WriteLine(message);
}
else
{
// Send message to logger with level
switch (level)
{
default:
goto case "debug";
case "debug":
// Only if Debug Level Enabled on Jackett
if (logger.IsDebugEnabled)
{
logger.Debug(message);
}
break;
case "info":
logger.Info(message);
break;
case "error":
logger.Error(message);
break;
}
}
}
/// <summary>
/// Validate Config entered by user on Jackett
/// </summary>
private void ValidateConfig()
{
Output("\nValidating Settings ... \n");
// Check Username Setting
if (string.IsNullOrEmpty(ConfigData.Username.Value))
{
throw new ExceptionWithConfigData("You must provide a username for this tracker to login !", ConfigData);
}
else
{
Output("Validated Setting -- Username (auth) => " + ConfigData.Username.Value);
}
// Check Password Setting
if (string.IsNullOrEmpty(ConfigData.Password.Value))
{
throw new ExceptionWithConfigData("You must provide a password with your username for this tracker to login !", ConfigData);
}
else
{
Output("Validated Setting -- Password (auth) => " + ConfigData.Password.Value);
}
// Check Max Page Setting
if (!string.IsNullOrEmpty(ConfigData.Pages.Value))
{
try
{
Output("Validated Setting -- Max Pages => " + Convert.ToInt32(ConfigData.Pages.Value));
}
catch (Exception)
{
throw new ExceptionWithConfigData("Please enter a numeric maximum number of pages to crawl !", ConfigData);
}
}
else
{
throw new ExceptionWithConfigData("Please enter a maximum number of pages to crawl !", ConfigData);
}
// Check Latency Setting
if (ConfigData.Latency.Value)
{
Output("\nValidated Setting -- Latency Simulation enabled");
// Check Latency Start Setting
if (!string.IsNullOrEmpty(ConfigData.LatencyStart.Value))
{
try
{
Output("Validated Setting -- Latency Start => " + Convert.ToInt32(ConfigData.LatencyStart.Value));
}
catch (Exception)
{
throw new ExceptionWithConfigData("Please enter a numeric latency start in ms !", ConfigData);
}
}
else
{
throw new ExceptionWithConfigData("Latency Simulation enabled, Please enter a start latency !", ConfigData);
}
// Check Latency End Setting
if (!string.IsNullOrEmpty(ConfigData.LatencyEnd.Value))
{
try
{
Output("Validated Setting -- Latency End => " + Convert.ToInt32(ConfigData.LatencyEnd.Value));
}
catch (Exception)
{
throw new ExceptionWithConfigData("Please enter a numeric latency end in ms !", ConfigData);
}
}
else
{
throw new ExceptionWithConfigData("Latency Simulation enabled, Please enter a end latency !", ConfigData);
}
}
// Check Browser Setting
if (ConfigData.Browser.Value)
{
Output("\nValidated Setting -- Browser Simulation enabled");
// Check ACCEPT header Setting
if (string.IsNullOrEmpty(ConfigData.HeaderAccept.Value))
{
throw new ExceptionWithConfigData("Browser Simulation enabled, Please enter an ACCEPT header !", ConfigData);
}
else
{
Output("Validated Setting -- ACCEPT (header) => " + ConfigData.HeaderAccept.Value);
}
// Check ACCEPT-LANG header Setting
if (string.IsNullOrEmpty(ConfigData.HeaderAcceptLang.Value))
{
throw new ExceptionWithConfigData("Browser Simulation enabled, Please enter an ACCEPT-LANG header !", ConfigData);
}
else
{
Output("Validated Setting -- ACCEPT-LANG (header) => " + ConfigData.HeaderAcceptLang.Value);
}
// Check USER-AGENT header Setting
if (string.IsNullOrEmpty(ConfigData.HeaderUserAgent.Value))
{
throw new ExceptionWithConfigData("Browser Simulation enabled, Please enter an USER-AGENT header !", ConfigData);
}
else
{
Output("Validated Setting -- USER-AGENT (header) => " + ConfigData.HeaderUserAgent.Value);
}
}
else
{
// Browser simulation must be enabled (otherwise, this provider will not work due to the tracker's security)
throw new ExceptionWithConfigData("Browser Simulation must be enabled for this provider to work, please enable it !", ConfigData);
}
// Check Dev Cache Settings
if (ConfigData.HardDriveCache.Value)
{
Output("\nValidated Setting -- DEV Hard Drive Cache enabled");
// Check if Dev Mode enabled !
if (!ConfigData.DevMode.Value)
{
throw new ExceptionWithConfigData("Hard Drive is enabled but not in DEV MODE, Please enable DEV MODE !", ConfigData);
}
// Check Cache Keep Time Setting
if (!string.IsNullOrEmpty(ConfigData.HardDriveCacheKeepTime.Value))
{
try
{
Output("Validated Setting -- Cache Keep Time (ms) => " + Convert.ToInt32(ConfigData.HardDriveCacheKeepTime.Value));
}
catch (Exception)
{
throw new ExceptionWithConfigData("Please enter a numeric hard drive keep time in ms !", ConfigData);
}
}
else
{
throw new ExceptionWithConfigData("Hard Drive Cache enabled, Please enter a maximum keep time for cache !", ConfigData);
}
}
else
{
// Delete cache if previously existed
CleanCacheStorage(true);
}
}
}
}
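A standalone sketch of the DevMode hard-drive cache used by QueryCache above: each tracker response is serialized to JSON in a temp folder, keyed by the request URL's hash code, so repeated queries during development skip the HTTP round trip. The folder name and payload type here are simplified stand-ins (the indexer caches a WebClientStringResult), assuming Newtonsoft.Json is available.

using System;
using System.IO;
using Newtonsoft.Json;

class HardDriveCacheSketch
{
    // Hypothetical cache folder; the indexer uses %TEMP%/Jackett/<provider name>.
    static readonly string CacheDir = Path.Combine(Path.GetTempPath(), "Jackett", "CacheSketch");

    static string GetOrAdd(string requestUrl, Func<string> fetch)
    {
        Directory.CreateDirectory(CacheDir);

        // Fingerprint the request the same way QueryCache does: hash code + ".json".
        var file = Path.Combine(CacheDir, requestUrl.GetHashCode() + ".json");

        if (File.Exists(file))
            return JsonConvert.DeserializeObject<string>(File.ReadAllText(file)); // cache hit: skip the HTTP request

        var result = fetch();                                                      // cache miss: query the tracker
        File.WriteAllText(file, JsonConvert.SerializeObject(result));
        return result;
    }

    static void Main()
    {
        var page = GetOrAdd("https://nordicb.org/browse.php?search=test",
                            () => "<html>...tracker response...</html>");
        Console.WriteLine(page.Length + " characters (cached for the next run)");
    }
}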

View File

@@ -59,7 +59,7 @@ namespace Jackett.Common.Indexers
AddCategoryMapping("53", TorznabCatType.TVSD);
AddCategoryMapping("41", TorznabCatType.TV);
AddCategoryMapping("55", TorznabCatType.TV);
AddCategoryMapping("2", TorznabCatType.TV);
AddCategoryMapping("2", TorznabCatType.TVSD);
AddCategoryMapping("30", TorznabCatType.TVAnime);
AddCategoryMapping("25", TorznabCatType.PCISO);
AddCategoryMapping("39", TorznabCatType.ConsoleWii);

View File

@@ -86,7 +86,7 @@ namespace Jackett.Common.Indexers
var results = await PerformQuery(new TorznabQuery());
if (!results.Any())
{
throw new Exception("Your cookie did not work");
throw new Exception("Your cookie did not work. You might have to change the \"Login Type\" to \"Normal\" in the x264 profile settings.");
}
IsConfigured = true;

View File

@@ -1,10 +1,42 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFrameworks>netstandard2.0;net452;net461</TargetFrameworks>
<TargetFrameworks>netstandard2.0;net461</TargetFrameworks>
<Version>0.0.0</Version>
</PropertyGroup>
<ItemGroup>
<PackageReference Include="AngleSharp" Version="0.9.10" />
<PackageReference Include="Autofac" Version="4.8.1" />
<PackageReference Include="AutoMapper" Version="7.0.1" />
<PackageReference Include="BencodeNET" Version="2.2.24" />
<PackageReference Include="CloudFlareUtilities" Version="1.2.0" />
<PackageReference Include="CommandLineParser" Version="2.3.0" />
<PackageReference Include="DotNet4.SocksProxy" Version="1.4.0.1" />
<PackageReference Include="Microsoft.CSharp" Version="4.5.0" />
<PackageReference Include="MimeMapping" Version="1.0.1.12" />
<PackageReference Include="Newtonsoft.Json" Version="11.0.2" />
<PackageReference Include="NLog" Version="4.5.8" />
<PackageReference Include="YamlDotNet" Version="5.0.1" />
</ItemGroup>
<!-- Conditionally obtain references for the .NET Full framework target -->
<ItemGroup Condition="'$(TargetFramework)' != 'netstandard2.0'">
<PackageReference Include="CsQuery" Version="1.3.5-beta5" />
<PackageReference Include="SharpZipLib" Version="0.86.0" />
<PackageReference Include="Microsoft.AspNetCore.WebUtilities" Version="1.1.2" />
<Reference Include="System.ServiceProcess" />
</ItemGroup>
<!-- Conditionally obtain references for the .NETStandard target -->
<ItemGroup Condition="'$(TargetFramework)' == 'netstandard2.0'">
<PackageReference Include="CsQuery.NETStandard" Version="1.3.6.1" />
<PackageReference Include="SharpZipLib" Version="1.0.0-alpha2" />
<PackageReference Include="System.IO.FileSystem.AccessControl" Version="4.5.0" />
<PackageReference Include="Microsoft.AspNetCore.WebUtilities" Version="2.1.1" />
<PackageReference Include="System.ServiceProcess.ServiceController" Version="4.5.0" />
</ItemGroup>
<ItemGroup>
<Content Include="Content\animate.css">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
@@ -113,22 +145,6 @@
</ItemGroup>
<ItemGroup>
<PackageReference Include="AngleSharp" Version="0.9.9.2" />
<PackageReference Include="Autofac" Version="4.8.1" />
<PackageReference Include="AutoMapper" Version="7.0.1" />
<PackageReference Include="BencodeNET" Version="2.2.24" />
<PackageReference Include="CloudFlareUtilities" Version="1.2.0" />
<PackageReference Include="CommandLineParser" Version="2.2.1" />
<PackageReference Include="DotNet4.SocksProxy" Version="1.4.0.1" />
<PackageReference Include="Microsoft.CSharp" Version="4.5.0" />
<PackageReference Include="MimeMapping" Version="1.0.1.12" />
<PackageReference Include="Newtonsoft.Json" Version="11.0.2" />
<PackageReference Include="NLog" Version="4.5.6" />
<PackageReference Include="YamlDotNet" Version="5.0.1" />
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\CurlSharp\CurlSharp.csproj" />
<ProjectReference Include="..\DateTimeRoutines\DateTimeRoutines.csproj" />
</ItemGroup>
@@ -188,31 +204,6 @@
<ItemGroup>
<Service Include="{508349b6-6b84-4df5-91f0-309beebad82d}" />
</ItemGroup>
<ItemGroup Condition="'$(TargetFramework)' != 'netstandard2.0'">
<PackageReference Include="CsQuery" Version="1.3.5-beta5" />
<PackageReference Include="SharpZipLib" Version="0.86.0" />
<PackageReference Include="Microsoft.AspNetCore.WebUtilities" Version="1.1.2" />
<Reference Include="System.ServiceProcess" />
</ItemGroup>
<ItemGroup Condition="'$(TargetFramework)' == 'netstandard2.0'">
<PackageReference Include="CsQuery.NETStandard">
<Version>1.3.6.1</Version>
</PackageReference>
<PackageReference Include="SharpZipLib">
<Version>1.0.0-alpha2</Version>
</PackageReference>
<PackageReference Include="System.IO.FileSystem.AccessControl">
<Version>4.4.0</Version>
</PackageReference>
<PackageReference Include="Microsoft.AspNetCore.WebUtilities">
<Version>2.0.0</Version>
</PackageReference>
<PackageReference Include="System.ServiceProcess.ServiceController">
<Version>4.5.0</Version>
</PackageReference>
</ItemGroup>
<ItemGroup>
<EmbeddedResource Update="Properties\Resources.resx">
@@ -221,5 +212,4 @@
</EmbeddedResource>
</ItemGroup>
</Project>

View File

@@ -84,16 +84,7 @@ namespace Jackett.Common.Models.Config
if (options.ListenPublic && options.ListenPrivate)
{
Console.WriteLine("You can only use listen private OR listen publicly.");
//TODO: Remove once off Owin
if (EnvironmentUtil.IsRunningLegacyOwin)
{
Engine.Exit(1);
}
else
{
Environment.Exit(1);
}
Environment.Exit(1);
}
// SSL Fix

View File

@@ -0,0 +1,52 @@
namespace Jackett.Common.Models.IndexerConfig.Bespoke
{
internal class ConfigurationDataNordicbits : ConfigurationData
{
public DisplayItem CredentialsWarning { get; private set; }
public StringItem Username { get; private set; }
public StringItem Password { get; private set; }
public DisplayItem PagesWarning { get; private set; }
public StringItem Pages { get; private set; }
public DisplayItem SecurityWarning { get; private set; }
public BoolItem Latency { get; private set; }
public BoolItem Browser { get; private set; }
public DisplayItem LatencyWarning { get; private set; }
public StringItem LatencyStart { get; private set; }
public StringItem LatencyEnd { get; private set; }
public DisplayItem HeadersWarning { get; private set; }
public StringItem HeaderAccept { get; private set; }
public StringItem HeaderAcceptLang { get; private set; }
public BoolItem HeaderDnt { get; private set; }
public BoolItem HeaderUpgradeInsecure { get; private set; }
public StringItem HeaderUserAgent { get; private set; }
public DisplayItem DevWarning { get; private set; }
public BoolItem DevMode { get; private set; }
public BoolItem HardDriveCache { get; private set; }
public StringItem HardDriveCacheKeepTime { get; private set; }
public ConfigurationDataNordicbits()
{
CredentialsWarning = new DisplayItem("<b>Credentials Configuration</b> (<i>Private Tracker</i>),<br /><br /> <ul><li><b>Username</b> is your account name on this tracker.</li><li><b>Password</b> is the password associated with your account name.</li></ul>") { Name = "Credentials" };
Username = new StringItem { Name = "Username (Required)", Value = "" };
Password = new StringItem { Name = "Password (Required)", Value = "" };
PagesWarning = new DisplayItem("<b>Preferences Configuration</b> (<i>Tweak your search settings</i>),<br /><br /> <ul><li><b>Max Pages to Process</b> lets you specify how many pages (max) Jackett can process when doing a search. Setting a value <b>higher than 4 is dangerous</b> for your account ! (<b>Too many requests to the tracker... <u>will look suspicious</u></b>).</li></ul>") { Name = "Preferences" };
Pages = new StringItem { Name = "Max Pages to Process (Required)", Value = "4" };
SecurityWarning = new DisplayItem("<b>Security Configuration</b> (<i>Read this area carefully !</i>),<br /><br /> <ul><li><b>Latency Simulation</b> will simulate human browsing by pausing Jackett for a random time between each request, to mimic real browsing.</li><li><b>Browser Simulation</b> will simulate a real browser by injecting additional headers when making requests to the tracker. <b>You must enable it to use this provider!</b></li></ul>") { Name = "Security" };
Latency = new BoolItem() { Name = "Latency Simulation (Optional)", Value = false };
Browser = new BoolItem() { Name = "Browser Simulation (Forced)", Value = true };
LatencyWarning = new DisplayItem("<b>Latency Configuration</b> (<i>Required if latency simulation enabled</i>),<br /><br/> <ul><li>By filling in this range, <b>Jackett will make a randomly timed pause</b> <u>between requests</u> to the tracker <u>to simulate a real browser</u>.</li><li>Milliseconds <b>only</b></li></ul>") { Name = "Simulate Latency" };
LatencyStart = new StringItem { Name = "Minimum Latency (ms)", Value = "1589" };
LatencyEnd = new StringItem { Name = "Maximum Latency (ms)", Value = "3674" };
HeadersWarning = new DisplayItem("<b>Browser Headers Configuration</b> (<i>Required if browser simulation enabled</i>),<br /><br /> <ul><li>By filling in these fields, <b>Jackett will inject headers</b> with your values <u>to simulate a real browser</u>.</li><li>You can get <b>your browser values</b> here: <a href='https://www.whatismybrowser.com/detect/what-http-headers-is-my-browser-sending' target='blank'>www.whatismybrowser.com</a></li></ul><br /><i><b>Note that</b> some headers are not necessary because they are injected automatically by this provider, such as Accept-Encoding, Connection, Host or X-Requested-With</i>") { Name = "Injecting headers" };
HeaderAccept = new StringItem { Name = "Accept", Value = "" };
HeaderAcceptLang = new StringItem { Name = "Accept-Language", Value = "" };
HeaderDnt = new BoolItem { Name = "DNT", Value = false };
HeaderUpgradeInsecure = new BoolItem { Name = "Upgrade-Insecure-Requests", Value = false };
HeaderUserAgent = new StringItem { Name = "User-Agent", Value = "" };
DevWarning = new DisplayItem("<b>Development Facility</b> (<i>For Developers ONLY</i>),<br /><br /> <ul><li>By enabling development mode, <b>Jackett will bypass its cache</b> and will <u>output debug messages to the console</u> instead of its log file.</li><li>By enabling Hard Drive Cache, <b>this provider</b> will <u>save each query answer from the tracker</u> in the temp directory; this drastically reduces HTTP requests when building a provider, e.g. at the parsing step. <b>Jackett will search for a cached query answer on the hard drive before executing the query on the tracker side!</b> <i>DEV MODE must be enabled to use it!</i></li></ul>") { Name = "Development" };
DevMode = new BoolItem { Name = "Enable DEV MODE (Developers ONLY)", Value = false };
HardDriveCache = new BoolItem { Name = "Enable HARD DRIVE CACHE (Developers ONLY)", Value = false };
HardDriveCacheKeepTime = new StringItem { Name = "Keep Cached files for (ms)", Value = "300000" };
}
}
}
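A minimal sketch, assuming the LatencyStart/LatencyEnd values above are parsed to integer milliseconds (the LatencySimulator class and PauseAsync method are illustrative names, not the indexer's actual code), of how the randomized pause described by the Latency options could be applied between tracker requests:

using System;
using System.Threading.Tasks;

namespace Jackett.Examples
{
    internal static class LatencySimulator
    {
        private static readonly Random Rng = new Random();

        // minMs/maxMs correspond to the "Minimum Latency"/"Maximum Latency" settings above.
        public static async Task PauseAsync(bool enabled, int minMs, int maxMs)
        {
            if (!enabled)
                return;

            // Pick a random delay inside the configured range to mimic human browsing.
            await Task.Delay(Rng.Next(minMs, maxMs + 1));
        }
    }
}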

View File

@@ -71,11 +71,6 @@ namespace Jackett.Common.Plumbing
private void RegisterWebClient<WebClientType>(ContainerBuilder builder)
{
//TODO: Remove once off Owin
if (EnvironmentUtil.IsRunningLegacyOwin)
{
Engine.WebClientType = typeof(WebClientType);
}
builder.RegisterType<WebClientType>().As<WebClient>();
}

View File

@@ -338,16 +338,7 @@ namespace Jackett.Common.Services
}
logger.Info("Exiting Jackett..");
//TODO: Remove once off Owin
if (EnvironmentUtil.IsRunningLegacyOwin)
{
Engine.Exit(0);
}
else
{
Environment.Exit(0);
}
Environment.Exit(0);
}
}
}

View File

@@ -1,167 +0,0 @@
using System;
using System.Linq;
using System.Net;
using System.Text;
using System.Threading.Tasks;
using CloudFlareUtilities;
using CurlSharp;
using CurlSharp.Enums;
using Jackett.Common.Models.Config;
using Jackett.Common.Services.Interfaces;
using NLog;
namespace Jackett.Common.Utils.Clients
{
public class UnixLibCurlWebClient : WebClient
{
public UnixLibCurlWebClient(IProcessService p, Logger l, IConfigurationService c, ServerConfig sc)
: base(p: p,
l: l,
c: c,
sc: sc)
{
}
private string CloudFlareChallengeSolverSolve(string challengePageContent, Uri uri)
{
var solution = ChallengeSolver.Solve(challengePageContent, uri.Host);
string clearanceUri = uri.Scheme + Uri.SchemeDelimiter + uri.Host + ":" + uri.Port + solution.ClearanceQuery;
return clearanceUri;
}
override public void Init()
{
try
{
logger.Info("LibCurl init " + Curl.GlobalInit(CurlInitFlag.All).ToString());
CurlHelper.OnErrorMessage += (msg) =>
{
logger.Error(msg);
};
}
catch (Exception e)
{
logger.Warn("Libcurl failed to initalize. Did you install it?");
logger.Warn("Debian: apt-get install libcurl4-openssl-dev");
logger.Warn("Redhat: yum install libcurl-devel");
throw e;
}
var version = Curl.Version;
logger.Info("LibCurl version " + version);
if (!serverConfig.RuntimeSettings.DoSSLFix.HasValue && version.IndexOf("NSS") > -1)
{
logger.Info("NSS Detected SSL ECC workaround enabled.");
serverConfig.RuntimeSettings.DoSSLFix = true;
}
}
// Wrapper for Run which takes care of CloudFlare challenges, calls RunCurl
override protected async Task<WebClientByteResult> Run(WebRequest request)
{
WebClientByteResult result = await RunCurl(request);
// check if we've received a CloudFlare challenge
string[] server;
if (result.Status == HttpStatusCode.ServiceUnavailable && result.Headers.TryGetValue("server", out server) && (server[0] == "cloudflare-nginx" || server[0] == "cloudflare"))
{
logger.Info("UnixLibCurlWebClient: Received a new CloudFlare challenge");
// solve the challenge
string pageContent = Encoding.UTF8.GetString(result.Content);
Uri uri = new Uri(request.Url);
string clearanceUri = CloudFlareChallengeSolverSolve(pageContent, uri);
logger.Info(string.Format("UnixLibCurlWebClient: CloudFlare clearanceUri: {0}", clearanceUri));
// wait...
await Task.Delay(5000);
// request clearanceUri to get cf_clearance cookie
var response = await CurlHelper.GetAsync(clearanceUri, serverConfig, request.Cookies, request.Referer);
logger.Info(string.Format("UnixLibCurlWebClient: received CloudFlare clearance cookie: {0}", response.Cookies));
// add new cf_clearance cookies to the original request
request.Cookies = response.Cookies + request.Cookies;
// re-run the original request with updated cf_clearance cookie
result = await RunCurl(request);
// add cf_clearance cookie to the final result so we update the config for the next request
result.Cookies = response.Cookies + " " + result.Cookies;
}
return result;
}
protected async Task<WebClientByteResult> RunCurl(WebRequest request)
{
CurlHelper.CurlResponse response;
if (request.Type == RequestType.GET)
{
response = await CurlHelper.GetAsync(request.Url, serverConfig, request.Cookies, request.Referer, request.Headers);
}
else
{
if (!string.IsNullOrEmpty(request.RawBody))
{
logger.Debug("UnixLibCurlWebClient: Posting " + request.RawBody);
}
else if (request.PostData != null && request.PostData.Count() > 0)
{
logger.Debug("UnixLibCurlWebClient: Posting " + StringUtil.PostDataFromDict(request.PostData));
}
response = await CurlHelper.PostAsync(request.Url, serverConfig, request.PostData, request.Cookies, request.Referer, request.Headers, request.RawBody);
}
var result = new WebClientByteResult()
{
Content = response.Content,
Cookies = response.Cookies,
Status = response.Status
};
if (response.HeaderList != null)
{
foreach (var header in response.HeaderList)
{
var key = header[0].ToLowerInvariant();
result.Headers[key] = new string[] { header[1] }; // doesn't support multiple identical headers?
switch (key)
{
case "location":
result.RedirectingTo = header[1];
break;
case "refresh":
if (response.Status == System.Net.HttpStatusCode.ServiceUnavailable)
{
//"Refresh: 8;URL=/cdn-cgi/l/chk_jschl?pass=1451000679.092-1vJFUJLb9R"
var redirval = "";
var value = header[1];
var start = value.IndexOf("=");
var end = value.IndexOf(";");
var len = value.Length;
if (start > -1)
{
redirval = value.Substring(start + 1);
result.RedirectingTo = redirval;
// normally we don't want a serviceunavailable (503) to be a redirect, but that's the nature
// of this cloudflare approach..don't want to alter BaseWebResult.IsRedirect because normally
// it shoudln't include service unavailable..only if we have this redirect header.
result.Status = System.Net.HttpStatusCode.Redirect;
var redirtime = Int32.Parse(value.Substring(0, end));
System.Threading.Thread.Sleep(redirtime * 1000);
}
}
break;
}
}
}
ServerUtil.ResureRedirectIsFullyQualified(request, result);
return result;
}
}
}

View File

@@ -1,179 +0,0 @@
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net;
using System.Text;
using System.Threading.Tasks;
using CurlSharp;
using Jackett.Common.Models.Config;
using Jackett.Common.Services.Interfaces;
using NLog;
namespace Jackett.Common.Utils.Clients
{
public class UnixSafeCurlWebClient : WebClient
{
public UnixSafeCurlWebClient(IProcessService p, Logger l, IConfigurationService c, ServerConfig sc)
: base(p: p,
l: l,
c: c,
sc: sc)
{
}
override public void Init()
{
}
override protected async Task<WebClientByteResult> Run(WebRequest request)
{
var args = new StringBuilder();
var proxy = serverConfig.GetProxyUrl(true);
if (proxy != null)
{
args.AppendFormat("-x '" + proxy + "' ");
}
args.AppendFormat("--url \"{0}\" ", request.Url);
if (request.EmulateBrowser == true)
args.AppendFormat("-i -sS --user-agent \"{0}\" ", BrowserUtil.ChromeUserAgent);
else
args.AppendFormat("-i -sS --user-agent \"{0}\" ", "Jackett/" + configService.GetVersion());
if (!string.IsNullOrWhiteSpace(request.Cookies))
{
args.AppendFormat("--cookie \"{0}\" ", request.Cookies);
}
if (!string.IsNullOrWhiteSpace(request.Referer))
{
args.AppendFormat("--referer \"{0}\" ", request.Referer);
}
if (!string.IsNullOrEmpty(request.RawBody))
{
var postString = StringUtil.PostDataFromDict(request.PostData);
args.AppendFormat("--data \"{0}\" ", request.RawBody.Replace("\"", "\\\""));
}
else if (request.PostData != null && request.PostData.Count() > 0)
{
var postString = StringUtil.PostDataFromDict(request.PostData);
args.AppendFormat("--data \"{0}\" ", postString);
}
var tempFile = Path.GetTempFileName();
args.AppendFormat("--output \"{0}\" ", tempFile);
if (serverConfig.RuntimeSettings.DoSSLFix == true)
{
// http://stackoverflow.com/questions/31107851/how-to-fix-curl-35-cannot-communicate-securely-with-peer-no-common-encryptio
// https://git.fedorahosted.org/cgit/mod_nss.git/plain/docs/mod_nss.html
args.Append("--cipher " + SSLFix.CipherList);
}
if (serverConfig.RuntimeSettings.IgnoreSslErrors == true)
{
args.Append("-k ");
}
args.Append("-H \"Accept-Language: en-US,en\" ");
args.Append("-H \"Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8\" ");
string stdout = null;
await Task.Run(() =>
{
stdout = processService.StartProcessAndGetOutput(System.Environment.OSVersion.Platform == PlatformID.Unix ? "curl" : "curl.exe", args.ToString(), true);
});
var outputData = File.ReadAllBytes(tempFile);
File.Delete(tempFile);
stdout = Encoding.UTF8.GetString(outputData);
var result = new WebClientByteResult();
var headSplit = stdout.IndexOf("\r\n\r\n");
if (headSplit < 0)
throw new Exception("Invalid response");
var headers = stdout.Substring(0, headSplit);
if (serverConfig.RuntimeSettings.ProxyConnection != null)
{
// the proxy provided headers too so we need to split headers again
var headSplit1 = stdout.IndexOf("\r\n\r\n", headSplit + 4);
if (headSplit1 > 0)
{
headers = stdout.Substring(headSplit + 4, headSplit1 - (headSplit + 4));
headSplit = headSplit1;
}
}
var headerCount = 0;
var cookieBuilder = new StringBuilder();
var cookies = new List<Tuple<string, string>>();
foreach (var header in headers.Split(new char[] { '\n', '\r' }, StringSplitOptions.RemoveEmptyEntries))
{
if (headerCount == 0)
{
var responseCode = int.Parse(header.Split(' ')[1]);
result.Status = (HttpStatusCode)responseCode;
}
else
{
var headerSplitIndex = header.IndexOf(':');
if (headerSplitIndex > 0)
{
var name = header.Substring(0, headerSplitIndex).ToLowerInvariant();
var value = header.Substring(headerSplitIndex + 1);
switch (name)
{
case "set-cookie":
var nameSplit = value.IndexOf('=');
if (nameSplit > -1)
{
cookies.Add(new Tuple<string, string>(value.Substring(0, nameSplit), value.Substring(0, value.IndexOf(';') + 1)));
}
break;
case "location":
result.RedirectingTo = value.Trim();
break;
case "refresh":
//"Refresh: 8;URL=/cdn-cgi/l/chk_jschl?pass=1451000679.092-1vJFUJLb9R"
var redirval = "";
var start = value.IndexOf("=");
var end = value.IndexOf(";");
var len = value.Length;
if (start > -1)
{
redirval = value.Substring(start + 1);
result.RedirectingTo = redirval;
// normally we don't want a serviceunavailable (503) to be a redirect, but that's the nature
// of this cloudflare approach..don't want to alter BaseWebResult.IsRedirect because normally
// it shoudln't include service unavailable..only if we have this redirect header.
result.Status = System.Net.HttpStatusCode.Redirect;
var redirtime = Int32.Parse(value.Substring(0, end));
System.Threading.Thread.Sleep(redirtime * 1000);
}
break;
}
}
}
headerCount++;
}
foreach (var cookieGroup in cookies.GroupBy(c => c.Item1))
{
cookieBuilder.AppendFormat("{0} ", cookieGroup.Last().Item2);
}
result.Cookies = cookieBuilder.ToString().Trim();
result.Content = new byte[outputData.Length - (headSplit + 3)];
var dest = 0;
for (int i = headSplit + 4; i < outputData.Length; i++)
{
result.Content[dest] = outputData[i];
dest++;
}
logger.Debug("WebClientByteResult returned " + result.Status);
ServerUtil.ResureRedirectIsFullyQualified(request, result);
return result;
}
}
}

View File

@@ -24,23 +24,5 @@ namespace Jackett.Common.Utils
}
}
public static bool IsRunningLegacyOwin
{
get
{
bool runningOwin;
try
{
runningOwin = AppDomain.CurrentDomain.GetAssemblies().Where(x => x.FullName.StartsWith("Jackett, ")).Any();
}
catch
{
runningOwin = true;
}
return runningOwin;
}
}
}
}

View File

@@ -1,12 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<startup>
<supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5.2"/>
</startup>
<system.net>
<settings>
<!-- needed to make the broken incapsula DDoS protection work on windows(e.g. for KickAssTorrent), see https://social.technet.microsoft.com/Forums/de-DE/b10b16d1-8eea-4b52-8aeb-f96ea87135fa/sectionresponseheader-detailcr-must-be-followed-by-lf?forum=powerquery -->
<httpWebRequest useUnsafeHeaderParsing="true" />
</settings>
</system.net>
</configuration>

View File

@@ -1,94 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="15.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<Import Project="$(MSBuildExtensionsPath)\$(MSBuildToolsVersion)\Microsoft.Common.props" Condition="Exists('$(MSBuildExtensionsPath)\$(MSBuildToolsVersion)\Microsoft.Common.props')" />
<PropertyGroup>
<Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
<Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
<ProjectGuid>{4E2A81DA-E235-4A88-AD20-38AABBFBF33C}</ProjectGuid>
<OutputType>Exe</OutputType>
<AppDesignerFolder>Properties</AppDesignerFolder>
<RootNamespace>Jackett.Console</RootNamespace>
<AssemblyName>JackettConsole</AssemblyName>
<TargetFrameworkVersion>v4.5.2</TargetFrameworkVersion>
<FileAlignment>512</FileAlignment>
<AutoGenerateBindingRedirects>true</AutoGenerateBindingRedirects>
<RestoreProjectStyle>PackageReference</RestoreProjectStyle>
<RuntimeIdentifier>win</RuntimeIdentifier>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
<PlatformTarget>AnyCPU</PlatformTarget>
<DebugSymbols>true</DebugSymbols>
<DebugType>full</DebugType>
<Optimize>false</Optimize>
<OutputPath>bin\Debug\</OutputPath>
<DefineConstants>DEBUG;TRACE</DefineConstants>
<ErrorReport>prompt</ErrorReport>
<WarningLevel>4</WarningLevel>
</PropertyGroup>
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
<PlatformTarget>AnyCPU</PlatformTarget>
<DebugType>pdbonly</DebugType>
<Optimize>true</Optimize>
<OutputPath>bin\Release\</OutputPath>
<DefineConstants>TRACE</DefineConstants>
<ErrorReport>prompt</ErrorReport>
<WarningLevel>4</WarningLevel>
</PropertyGroup>
<PropertyGroup>
<ApplicationIcon>jackett.ico</ApplicationIcon>
</PropertyGroup>
<PropertyGroup>
<StartupObject>Jackett.Console.Program</StartupObject>
</PropertyGroup>
<ItemGroup>
<Reference Include="System" />
<Reference Include="System.Configuration" />
<Reference Include="System.Core" />
<Reference Include="System.IO.Compression" />
<Reference Include="System.Net.Http.WebRequest" />
<Reference Include="System.Runtime.Serialization" />
<Reference Include="System.ServiceModel" />
<Reference Include="System.Transactions" />
<Reference Include="System.Xml.Linq" />
<Reference Include="System.Data.DataSetExtensions" />
<Reference Include="Microsoft.CSharp" />
<Reference Include="System.Data" />
<Reference Include="System.Net.Http" />
<Reference Include="System.Xml" />
</ItemGroup>
<ItemGroup>
<Compile Include="Program.cs" />
<Compile Include="Properties\AssemblyInfo.cs" />
</ItemGroup>
<ItemGroup>
<None Include="App.config" />
<Content Include="install_service_macos">
<CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
</Content>
</ItemGroup>
<ItemGroup>
<ProjectReference Include="..\CurlSharp\CurlSharp.csproj">
<Project>{74420a79-cc16-442c-8b1e-7c1b913844f0}</Project>
<Name>CurlSharp</Name>
</ProjectReference>
<ProjectReference Include="..\Jackett.Common\Jackett.Common.csproj">
<Project>{6B854A1B-9A90-49C0-BC37-9A35C75BCA73}</Project>
<Name>Jackett.Common</Name>
</ProjectReference>
<ProjectReference Include="..\Jackett\Jackett.csproj">
<Project>{e636d5f8-68b4-4903-b4ed-ccfd9c9e899f}</Project>
<Name>Jackett</Name>
</ProjectReference>
</ItemGroup>
<ItemGroup>
<Content Include="jackett.ico" />
</ItemGroup>
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
<!-- To modify your build process, add your task inside one of the targets below and uncomment it.
Other similar extension points exist, see Microsoft.Common.targets.
<Target Name="BeforeBuild">
</Target>
<Target Name="AfterBuild">
</Target>
-->
</Project>

View File

@@ -1,209 +0,0 @@
using System;
using CommandLine;
using CommandLine.Text;
using Jackett.Common;
using Jackett.Common.Models.Config;
using Jackett.Common.Utils;
using Jackett.Utils;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;
using System.Net;
using System.Reflection;
using System.Text;
using System.Text.RegularExpressions;
using System.Threading;
using System.Threading.Tasks;
namespace Jackett.Console
{
public class Program
{
static void Main(string[] args)
{
var optionsResult = Parser.Default.ParseArguments<ConsoleOptions>(args);
optionsResult.WithNotParsed(errors =>
{
var text = HelpText.AutoBuild(optionsResult);
text.Copyright = " ";
text.Heading = "Jackett v" + EnvironmentUtil.JackettVersion + " options:";
System.Console.WriteLine(text);
Environment.ExitCode = 1;
return;
});
optionsResult.WithParsed(options =>
{
try
{
var runtimeSettings = options.ToRunTimeSettings();
// Initialize autofac, logger, etc. We cannot use any calls to Engine before the container is set up.
Engine.BuildContainer(runtimeSettings, new WebApi2Module());
if (runtimeSettings.LogRequests)
Engine.Logger.Info("Logging enabled.");
if (runtimeSettings.TracingEnabled)
Engine.Logger.Info("Tracing enabled.");
if (runtimeSettings.IgnoreSslErrors == true)
{
Engine.Logger.Info("Jackett will ignore SSL certificate errors.");
}
if (runtimeSettings.DoSSLFix == true)
Engine.Logger.Info("SSL ECC workaround enabled.");
else if (runtimeSettings.DoSSLFix == false)
Engine.Logger.Info("SSL ECC workaround has been disabled.");
// Choose Data Folder
if (!string.IsNullOrWhiteSpace(runtimeSettings.CustomDataFolder))
{
Engine.Logger.Info("Jackett Data will be stored in: " + runtimeSettings.CustomDataFolder);
}
if(!string.IsNullOrEmpty(runtimeSettings.ClientOverride))
{
if (runtimeSettings.ClientOverride != "httpclient" && runtimeSettings.ClientOverride != "httpclient2" && runtimeSettings.ClientOverride != "httpclientnetcore")
{
Engine.Logger.Error($"Client override ({runtimeSettings.ClientOverride}) has been deprecated, please remove it from your start arguments");
Environment.Exit(1);
}
}
// Use Proxy
if (options.ProxyConnection != null)
{
Engine.Logger.Info("Proxy enabled. " + runtimeSettings.ProxyConnection);
}
/* ====== Actions ===== */
// Install service
if (options.Install)
{
Engine.ServiceConfig.Install();
return;
}
// Uninstall service
if (options.Uninstall)
{
Engine.Server.ReserveUrls(doInstall: false);
Engine.ServiceConfig.Uninstall();
return;
}
// Reserve urls
if (options.ReserveUrls)
{
Engine.Server.ReserveUrls(doInstall: true);
return;
}
// Start Service
if (options.StartService)
{
if (!Engine.ServiceConfig.ServiceRunning())
{
Engine.ServiceConfig.Start();
}
return;
}
// Stop Service
if (options.StopService)
{
if (Engine.ServiceConfig.ServiceRunning())
{
Engine.ServiceConfig.Stop();
}
return;
}
// Migrate settings
if (options.MigrateSettings)
{
Engine.ConfigService.PerformMigration();
return;
}
// Show Version
if (options.ShowVersion)
{
System.Console.WriteLine("Jackett v" + EnvironmentUtil.JackettVersion);
return;
}
/* ====== Overrides ===== */
// Override listen public
if (options.ListenPublic || options.ListenPrivate)
{
if (Engine.ServerConfig.AllowExternal != options.ListenPublic)
{
Engine.Logger.Info("Overriding external access to " + options.ListenPublic);
Engine.ServerConfig.AllowExternal = options.ListenPublic;
if (System.Environment.OSVersion.Platform != PlatformID.Unix)
{
if (ServerUtil.IsUserAdministrator())
{
Engine.Server.ReserveUrls(doInstall: true);
}
else
{
Engine.Logger.Error("Unable to switch to public listening without admin rights.");
Engine.Exit(1);
}
}
Engine.SaveServerConfig();
}
}
// Override port
if (options.Port != 0)
{
if (Engine.ServerConfig.Port != options.Port)
{
Engine.Logger.Info("Overriding port to " + options.Port);
Engine.ServerConfig.Port = options.Port;
if (System.Environment.OSVersion.Platform != PlatformID.Unix)
{
if (ServerUtil.IsUserAdministrator())
{
Engine.Server.ReserveUrls(doInstall: true);
}
else
{
Engine.Logger.Error("Unable to switch ports when not running as administrator");
Engine.Exit(1);
}
}
Engine.SaveServerConfig();
}
}
Engine.Server.Initalize();
Engine.Server.Start();
Engine.RunTime.Spin();
Engine.Logger.Info("Server thread exit");
}
catch (Exception e)
{
Engine.Logger.Error(e, "Top level exception");
}
});
}
}
}

View File

@@ -1,35 +0,0 @@
using System.Reflection;
using System.Runtime.InteropServices;
// General Information about an assembly is controlled through the following
// set of attributes. Change these attribute values to modify the information
// associated with an assembly.
[assembly: AssemblyTitle("Jackett.Console")]
[assembly: AssemblyDescription("")]
[assembly: AssemblyConfiguration("")]
[assembly: AssemblyCompany("")]
[assembly: AssemblyProduct("Jackett.Console")]
[assembly: AssemblyCopyright("Copyright © 2015")]
[assembly: AssemblyTrademark("")]
[assembly: AssemblyCulture("")]
// Setting ComVisible to false makes the types in this assembly not visible
// to COM components. If you need to access a type in this assembly from
// COM, set the ComVisible attribute to true on that type.
[assembly: ComVisible(false)]
// The following GUID is for the ID of the typelib if this project is exposed to COM
[assembly: Guid("4e2a81da-e235-4a88-ad20-38aabbfbf33c")]
// Version information for an assembly consists of the following four values:
//
// Major Version
// Minor Version
// Build Number
// Revision
//
// You can specify all the values or you can default the Build and Revision Numbers
// by using the '*' as shown below:
// [assembly: AssemblyVersion("1.0.*")]
[assembly: AssemblyVersion("0.0.0.0")]
[assembly: AssemblyFileVersion("0.0.0.0")]

View File

@@ -1,77 +0,0 @@
#!/bin/bash
#Setting up colors
BOLDRED="$(printf '\033[1;31m')"
BOLDGREEN="$(printf '\033[1;32m')"
NC="$(printf '\033[0m')" # No Color
# Stop and unload the service if it's running
launchctl remove org.user.Jackett
# Move working directory to Jackett's
cd "$(dirname "$0")"
# Check if we're running from Jackett's directory
if [ ! -f ./JackettConsole.exe ]; then
echo "${BOLDRED}ERROR${NC}: Couldn't locate JackettConsole.exe. Is the script in the right directory?"
exit 1
fi
jackettdir="$(pwd)"
# Check if mono is installed
command -v mono >/dev/null 2>&1 || { echo >&2 "${BOLDRED}ERROR${NC}: Jackett requires Mono but it's not installed. Aborting."; exit 1; }
monodir="$(dirname $(command -v mono))"
# Check that no other service called Jackett is already running
if [[ $(launchctl list | grep org.user.Jackett) ]]; then
echo "${BOLDRED}ERROR${NC}: Jackett already seems to be running as a service. Please stop it before running this script again."
exit 1
fi
# Write the plist to LaunchAgents
mkdir -p ~/Library/LaunchAgents/
cat >~/Library/LaunchAgents/org.user.Jackett.plist <<EOL
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>EnvironmentVariables</key>
<dict>
<key>PATH</key>
<string>/usr/bin:/bin:/usr/sbin:/sbin:${monodir}</string>
</dict>
<key>KeepAlive</key>
<true/>
<key>Label</key>
<string>org.user.Jackett</string>
<key>ProgramArguments</key>
<array>
<string>${monodir}/mono</string>
<string>--debug</string>
<string>JackettConsole.exe</string>
<string>--NoRestart</string>
</array>
<key>RunAtLoad</key>
<true/>
<key>WorkingDirectory</key>
<string>${jackettdir}</string>
</dict>
</plist>
EOL
# Run the agent
launchctl load ~/Library/LaunchAgents/org.user.Jackett.plist
# Check that it's running
if [[ $(launchctl list | grep org.user.Jackett) ]]; then
echo "${BOLDGREEN}Agent successfully installed and launched!${NC}"
else
cat << EOL
${BOLDRED}ERROR${NC}: Could not launch agent. The installation might have failed.
Please open an issue on https://github.com/Jackett/Jackett/issues and paste following information:
Mono directory: \`${monodir}\`
Jackett directory: \`${jackettdir}\`
EOL
fi

Binary file not shown (a removed image, previously 298 KiB).

View File

@@ -340,11 +340,13 @@ namespace Jackett.Server.Controllers
if (CurrentQuery.ImdbID != null)
{
/* We should allow this (helpful in case of aggregate indexers)
if (!string.IsNullOrEmpty(CurrentQuery.SearchTerm))
{
logger.Warn($"A search request from {Request.HttpContext.Connection.RemoteIpAddress} was made containing q and imdbid.");
return GetErrorXML(201, "Incorrect parameter: please specify either imdbid or q");
}
*/
CurrentQuery.ImdbID = ParseUtil.GetFullImdbID(CurrentQuery.ImdbID); // normalize ImdbID
if (CurrentQuery.ImdbID == null)

View File

@@ -96,8 +96,8 @@ namespace Jackett.Server.Controllers
new ClaimsPrincipal(claimsIdentity),
new AuthenticationProperties
{
ExpiresUtc = DateTime.UtcNow.AddMinutes(20),
IsPersistent = false,
ExpiresUtc = DateTime.UtcNow.AddDays(14), // Persistent login cookie, expires after 14 days
IsPersistent = true,
AllowRefresh = true
});
}

View File

@@ -22,9 +22,9 @@
<ItemGroup>
<PackageReference Include="Autofac" Version="4.8.1" />
<PackageReference Include="Autofac.Extensions.DependencyInjection" Version="4.2.2" />
<PackageReference Include="Autofac.Extensions.DependencyInjection" Version="4.3.0" />
<PackageReference Include="AutoMapper" Version="7.0.1" />
<PackageReference Include="CommandLineParser" Version="2.2.1" />
<PackageReference Include="CommandLineParser" Version="2.3.0" />
<PackageReference Include="Microsoft.AspNetCore" Version="2.1.2" />
<PackageReference Include="Microsoft.AspNetCore.Authentication" Version="2.1.1" />
<PackageReference Include="Microsoft.AspNetCore.Authentication.Cookies" Version="2.1.1" />
@@ -33,9 +33,8 @@
<PackageReference Include="Microsoft.AspNetCore.Rewrite" Version="2.1.1" />
<PackageReference Include="Microsoft.AspNetCore.StaticFiles" Version="2.1.1" />
<PackageReference Include="Microsoft.Extensions.Configuration" Version="2.1.1" />
<PackageReference Include="Microsoft.Extensions.FileProviders.Physical" Version="2.1.1" />
<PackageReference Include="NLog" Version="4.5.6" />
<PackageReference Include="NLog.Web.AspNetCore" Version="4.5.4" />
<PackageReference Include="NLog" Version="4.5.8" />
<PackageReference Include="NLog.Web.AspNetCore" Version="4.6.0" />
<PackageReference Include="System.ServiceProcess.ServiceController" Version="4.5.0" />
<PackageReference Include="System.Text.Encoding.CodePages" Version="4.5.0" />
</ItemGroup>

View File

@@ -49,7 +49,7 @@ namespace Jackett.Server
{
//TODO: Remove libcurl once off owin
bool runningOnDotNetCore = RuntimeInformation.FrameworkDescription.IndexOf("Core", StringComparison.OrdinalIgnoreCase) >= 0;
if (runningOnDotNetCore)
{
options.Client = "httpclientnetcore";
@@ -107,6 +107,7 @@ namespace Jackett.Server
var builder = new ConfigurationBuilder();
builder.AddInMemoryCollection(runtimeDictionary);
builder.AddJsonFile(Path.Combine(configurationService.GetAppDataFolder(), "appsettings.json"), optional: true);
Configuration = builder.Build();
@@ -130,7 +131,10 @@ namespace Jackett.Server
try
{
logger.Debug("Creating web host...");
CreateWebHostBuilder(args, url).Build().Run();
string applicationFolder = Path.Combine(configurationService.ApplicationFolder(), "Content");
logger.Debug($"Content root path is: {applicationFolder}");
CreateWebHostBuilder(args, url, applicationFolder).Build().Run();
}
catch (Exception ex)
{
@@ -174,11 +178,13 @@ namespace Jackett.Server
}
}
public static IWebHostBuilder CreateWebHostBuilder(string[] args, string[] urls) =>
public static IWebHostBuilder CreateWebHostBuilder(string[] args, string[] urls, string contentRoot) =>
WebHost.CreateDefaultBuilder(args)
.UseConfiguration(Configuration)
.UseContentRoot(contentRoot)
.UseWebRoot(contentRoot)
.UseUrls(urls)
.PreferHostingUrls(true)
.UseConfiguration(Configuration)
.UseStartup<Startup>()
.UseNLog();
}

View File

@@ -18,7 +18,6 @@ using Microsoft.AspNetCore.Mvc.Authorization;
using Microsoft.AspNetCore.Rewrite;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.FileProviders;
using Newtonsoft.Json.Serialization;
using System;
using System.IO;
@@ -118,7 +117,10 @@ namespace Jackett.Server
app.UseForwardedHeaders(new ForwardedHeadersOptions
{
ForwardedHeaders = ForwardedHeaders.XForwardedFor | ForwardedHeaders.XForwardedProto
// When adjusting these parameters, make sure the change is well tested with various environments
// See https://github.com/Jackett/Jackett/issues/3517
ForwardLimit = 10,
ForwardedHeaders = ForwardedHeaders.XForwardedProto | ForwardedHeaders.XForwardedHost
});
var rewriteOptions = new RewriteOptions()
@@ -128,13 +130,7 @@ namespace Jackett.Server
app.UseRewriter(rewriteOptions);
app.UseFileServer(new FileServerOptions
{
FileProvider = new PhysicalFileProvider(Helper.ConfigService.GetContentFolder()),
RequestPath = "",
EnableDefaultFiles = true,
EnableDirectoryBrowsing = false
});
app.UseStaticFiles();
app.UseAuthentication();
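
The forwarded-headers change above appears only as a diff fragment; a minimal, self-contained sketch of the same reverse-proxy configuration in a standard ASP.NET Core Configure method (ReverseProxySketch is an illustrative name, not Jackett's actual Startup class):

using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.HttpOverrides;

namespace Jackett.Examples
{
    internal class ReverseProxySketch
    {
        public void Configure(IApplicationBuilder app)
        {
            app.UseForwardedHeaders(new ForwardedHeadersOptions
            {
                // Allow several proxy hops; see https://github.com/Jackett/Jackett/issues/3517
                ForwardLimit = 10,
                // Trust the scheme and host supplied by the reverse proxy.
                ForwardedHeaders = ForwardedHeaders.XForwardedProto | ForwardedHeaders.XForwardedHost
            });
        }
    }
}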

View File

@@ -1,10 +0,0 @@
{
"Logging": {
"IncludeScopes": false,
"LogLevel": {
"Default": "Debug",
"System": "Information",
"Microsoft": "Information"
}
}
}

View File

@@ -1,15 +0,0 @@
{
"Logging": {
"IncludeScopes": false,
"Debug": {
"LogLevel": {
"Default": "Warning"
}
},
"Console": {
"LogLevel": {
"Default": "Warning"
}
}
}
}

View File

@@ -43,7 +43,6 @@
<Reference Include="System.Configuration" />
<Reference Include="System.Configuration.Install" />
<Reference Include="System.Core" />
<Reference Include="System.IO.Compression" />
<Reference Include="System.Net.Http.WebRequest" />
<Reference Include="System.Runtime.Serialization" />
<Reference Include="System.ServiceModel" />
@@ -52,7 +51,6 @@
<Reference Include="System.Data.DataSetExtensions" />
<Reference Include="Microsoft.CSharp" />
<Reference Include="System.Data" />
<Reference Include="System.Net.Http" />
<Reference Include="System.ServiceProcess" />
<Reference Include="System.Xml" />
</ItemGroup>

View File

@@ -2,6 +2,7 @@
<PropertyGroup>
<TargetFramework>net461</TargetFramework>
<PlatformTarget>x86</PlatformTarget>
<IsPackable>false</IsPackable>
</PropertyGroup>
@@ -22,7 +23,8 @@
<ItemGroup>
<PackageReference Include="Autofac" Version="4.8.1" />
<PackageReference Include="FluentAssertions" Version="5.4.1" />
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="15.7.2" />
<PackageReference Include="Microsoft.AspNetCore.DataProtection" Version="2.1.1" />
<PackageReference Include="Microsoft.NET.Test.Sdk" Version="15.8.0" />
<PackageReference Include="MSTest.TestAdapter" Version="1.3.2" />
<PackageReference Include="MSTest.TestFramework" Version="1.3.2" />
<PackageReference Include="NUnit" Version="3.10.1" />
@@ -36,7 +38,7 @@
<ItemGroup>
<ProjectReference Include="..\Jackett.Common\Jackett.Common.csproj" />
<ProjectReference Include="..\Jackett\Jackett.csproj" />
<ProjectReference Include="..\Jackett.Server\Jackett.Server.csproj" />
</ItemGroup>
<ItemGroup>

View File

@@ -7,21 +7,25 @@ using Jackett.Common.Plumbing;
using Jackett.Common.Models.Config;
using Jackett.Common.Services.Interfaces;
using Jackett.Common.Utils.Clients;
using Microsoft.AspNetCore.DataProtection;
namespace Jackett.Test
{
class TestUtil
static class TestUtil
{
private static IContainer testContainer = null;
private static IContainer testContainer;
public static void SetupContainer()
{
IDataProtectionProvider dataProtectionProvider = new EphemeralDataProtectionProvider();
var builder = new ContainerBuilder();
builder.RegisterModule(new JackettModule(new RuntimeSettings()));
builder.RegisterType<Jackett.Services.ProtectionService>().As<IProtectionService>();
builder.RegisterType<Jackett.Server.Services.ProtectionService>().As<IProtectionService>();
builder.RegisterType<TestWebClient>().As<WebClient>().SingleInstance();
builder.RegisterInstance(LogManager.GetCurrentClassLogger()).SingleInstance();
builder.RegisterType<TestIndexerManagerServiceHelper>().As<IIndexerManagerService>().SingleInstance();
builder.RegisterInstance(dataProtectionProvider).SingleInstance();
testContainer = builder.Build();
}

View File

@@ -110,7 +110,7 @@
</ItemGroup>
<ItemGroup>
<PackageReference Include="CommandLineParser">
<Version>2.2.1</Version>
<Version>2.3.0</Version>
</PackageReference>
</ItemGroup>
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />

View File

@@ -1,7 +1,7 @@
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<TargetFrameworks>net452;net461;netcoreapp2.1</TargetFrameworks>
<TargetFrameworks>net461;netcoreapp2.1</TargetFrameworks>
<ApplicationIcon>jackett.ico</ApplicationIcon>
<AssemblyName>JackettUpdater</AssemblyName>
<OutputType>Exe</OutputType>

View File

@@ -232,6 +232,24 @@ namespace Jackett.Updater
"Definitions/tehconnection.yml",
"Definitions/torrentwtf.yml",
"Definitions/eotforum.yml",
"Definitions/nexttorrent.yml",
"appsettings.Development.json",
"appsettings.json",
"CurlSharp.dll",
"CurlSharp.pdb",
"Jackett.dll",
"Jackett.dll.config",
"Jackett.pdb",
"Autofac.Integration.WebApi.dll",
"Microsoft.Owin.dll",
"Microsoft.Owin.FileSystems.dll",
"Microsoft.Owin.Host.HttpListener.dll",
"Microsoft.Owin.Hosting.dll",
"Microsoft.Owin.StaticFiles.dll",
"Owin.dll",
"System.Web.Http.dll",
"System.Web.Http.Owin.dll",
"System.Web.Http.Tracing.dll",
};
foreach (var oldFile in oldFiles)

View File

@@ -2,10 +2,6 @@ Microsoft Visual Studio Solution File, Format Version 12.00
# Visual Studio 15
VisualStudioVersion = 15.0.27004.2008
MinimumVisualStudioVersion = 10.0.40219.1
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Jackett", "Jackett\Jackett.csproj", "{E636D5F8-68B4-4903-B4ED-CCFD9C9E899F}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "CurlSharp", "CurlSharp\CurlSharp.csproj", "{74420A79-CC16-442C-8B1E-7C1B913844F0}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Solution Items", "Solution Items", "{BE7B0C8A-6144-47CD-821E-B09BA1B7BADE}"
ProjectSection(SolutionItems) = preProject
..\appveyor.yml = ..\appveyor.yml
@@ -15,8 +11,6 @@ Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Solution Items", "Solution
..\README.md = ..\README.md
EndProjectSection
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Jackett.Console", "Jackett.Console\Jackett.Console.csproj", "{4E2A81DA-E235-4A88-AD20-38AABBFBF33C}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Jackett.Service", "Jackett.Service\Jackett.Service.csproj", "{BF611F7B-4658-4CB8-AA9E-0736FADAA3BA}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "Jackett.Tray", "Jackett.Tray\Jackett.Tray.csproj", "{FF9025B1-EC14-4AA9-8081-9F69C5E35B63}"
@@ -25,20 +19,10 @@ Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Jackett.Updater", "Jackett.
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Jackett.Test", "Jackett.Test\Jackett.Test.csproj", "{FA22C904-9F5D-4D3C-9122-3E33652E7373}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Vendor", "Vendor", "{7D7FA63C-3C2C-4B56-BD93-8CD28CF44E5D}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "DateTimeRoutines", "DateTimeRoutines\DateTimeRoutines.csproj", "{C28A79EE-EF81-4EEE-A7FE-EB636423C935}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Common", "Common", "{2FA9B879-5882-4B39-8D34-9EBCB82B4F2B}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Jackett.Common", "Jackett.Common\Jackett.Common.csproj", "{6B854A1B-9A90-49C0-BC37-9A35C75BCA73}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = ".NET", ".NET", "{FF8B9A1B-AE7E-4F14-9C37-DA861D034738}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = ".NET Core", ".NET Core", "{6A06EC9B-AF21-4DE8-9B50-BC7E3C2C78B9}"
EndProject
Project("{2150E333-8FDC-42A3-9474-1A3956D46DE8}") = "Executables", "Executables", "{AA50F785-12B8-4669-8D4F-EAFB49258E60}"
EndProject
Project("{9A19103F-16F7-4668-BE54-9A1E7A4F7556}") = "Jackett.Server", "Jackett.Server\Jackett.Server.csproj", "{84182782-EDBC-4342-ADA6-72B7694D0862}"
EndProject
Global
@@ -47,18 +31,6 @@ Global
Release|Any CPU = Release|Any CPU
EndGlobalSection
GlobalSection(ProjectConfigurationPlatforms) = postSolution
{E636D5F8-68B4-4903-B4ED-CCFD9C9E899F}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{E636D5F8-68B4-4903-B4ED-CCFD9C9E899F}.Debug|Any CPU.Build.0 = Debug|Any CPU
{E636D5F8-68B4-4903-B4ED-CCFD9C9E899F}.Release|Any CPU.ActiveCfg = Release|Any CPU
{E636D5F8-68B4-4903-B4ED-CCFD9C9E899F}.Release|Any CPU.Build.0 = Release|Any CPU
{74420A79-CC16-442C-8B1E-7C1B913844F0}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{74420A79-CC16-442C-8B1E-7C1B913844F0}.Debug|Any CPU.Build.0 = Debug|Any CPU
{74420A79-CC16-442C-8B1E-7C1B913844F0}.Release|Any CPU.ActiveCfg = Release|Any CPU
{74420A79-CC16-442C-8B1E-7C1B913844F0}.Release|Any CPU.Build.0 = Release|Any CPU
{4E2A81DA-E235-4A88-AD20-38AABBFBF33C}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{4E2A81DA-E235-4A88-AD20-38AABBFBF33C}.Debug|Any CPU.Build.0 = Debug|Any CPU
{4E2A81DA-E235-4A88-AD20-38AABBFBF33C}.Release|Any CPU.ActiveCfg = Release|Any CPU
{4E2A81DA-E235-4A88-AD20-38AABBFBF33C}.Release|Any CPU.Build.0 = Release|Any CPU
{BF611F7B-4658-4CB8-AA9E-0736FADAA3BA}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{BF611F7B-4658-4CB8-AA9E-0736FADAA3BA}.Debug|Any CPU.Build.0 = Debug|Any CPU
{BF611F7B-4658-4CB8-AA9E-0736FADAA3BA}.Release|Any CPU.ActiveCfg = Release|Any CPU
@@ -91,21 +63,6 @@ Global
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
EndGlobalSection
GlobalSection(NestedProjects) = preSolution
{E636D5F8-68B4-4903-B4ED-CCFD9C9E899F} = {FF8B9A1B-AE7E-4F14-9C37-DA861D034738}
{74420A79-CC16-442C-8B1E-7C1B913844F0} = {7D7FA63C-3C2C-4B56-BD93-8CD28CF44E5D}
{4E2A81DA-E235-4A88-AD20-38AABBFBF33C} = {FF8B9A1B-AE7E-4F14-9C37-DA861D034738}
{BF611F7B-4658-4CB8-AA9E-0736FADAA3BA} = {FF8B9A1B-AE7E-4F14-9C37-DA861D034738}
{FF9025B1-EC14-4AA9-8081-9F69C5E35B63} = {FF8B9A1B-AE7E-4F14-9C37-DA861D034738}
{A61E311A-6F8B-4497-B5E4-2EA8994C7BD8} = {FF8B9A1B-AE7E-4F14-9C37-DA861D034738}
{FA22C904-9F5D-4D3C-9122-3E33652E7373} = {FF8B9A1B-AE7E-4F14-9C37-DA861D034738}
{7D7FA63C-3C2C-4B56-BD93-8CD28CF44E5D} = {2FA9B879-5882-4B39-8D34-9EBCB82B4F2B}
{C28A79EE-EF81-4EEE-A7FE-EB636423C935} = {7D7FA63C-3C2C-4B56-BD93-8CD28CF44E5D}
{6B854A1B-9A90-49C0-BC37-9A35C75BCA73} = {2FA9B879-5882-4B39-8D34-9EBCB82B4F2B}
{FF8B9A1B-AE7E-4F14-9C37-DA861D034738} = {AA50F785-12B8-4669-8D4F-EAFB49258E60}
{6A06EC9B-AF21-4DE8-9B50-BC7E3C2C78B9} = {AA50F785-12B8-4669-8D4F-EAFB49258E60}
{84182782-EDBC-4342-ADA6-72B7694D0862} = {6A06EC9B-AF21-4DE8-9B50-BC7E3C2C78B9}
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
SolutionGuid = {54BC4102-8B85-49C1-BA12-257D941D1B97}
EndGlobalSection

View File

@@ -1,8 +0,0 @@
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<startup>
<supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5.2"/>
</startup>
</configuration>

Some files were not shown because too many files have changed in this diff.