Key | Value |
---|---|
FileName | ./usr/lib/python3.7/site-packages/urlgrabber/grabber.py |
FileSize | 98735 |
MD5 | 01150FA6F924E730F53081FC6BB0E37F |
SHA-1 | 5E4469ECFD5E2EB46E2EAB19B6D55453D7C840FD |
SHA-256 | 0E5E999AFB9022F44E8FC5C8BA0D667DD9A833F15A556AED6BFC8E282FEF787F |
SSDEEP | 3072:akN9wJlydxKfVQE0e9awbj7ro2Fkefnfap:akDdxKfVQE0ibj7ro2Sqf6 |
TLSH | T1B3A3F62A4546A23B8723D96A4997E053671DAC1B1E1FB0347CFCC2943F85630D2F6EE9 |
hashlookup:parent-total | 2 |
hashlookup:trust | 60 |
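The MD5, SHA-1, and SHA-256 digests above can be re-checked against a local copy of the file with Python's standard `hashlib`; a minimal sketch, assuming the path from the table exists on the local filesystem:

```python
import hashlib

def file_digests(path, chunk_size=65536):
    """Compute MD5, SHA-1, and SHA-256 of a file in streaming chunks."""
    hashes = {name: hashlib.new(name) for name in ("md5", "sha1", "sha256")}
    with open(path, "rb") as fh:
        while chunk := fh.read(chunk_size):
            for h in hashes.values():
                h.update(chunk)
    # Upper-case hex to match the digest format used in this report
    return {name: h.hexdigest().upper() for name, h in hashes.items()}

# Hypothetical usage against the path reported above:
# digests = file_digests("./usr/lib/python3.7/site-packages/urlgrabber/grabber.py")
# then compare digests["sha256"] with the SHA-256 value in the table
```

Streaming in chunks keeps memory use constant regardless of file size; the SSDEEP and TLSH fuzzy hashes require third-party libraries and are not covered by `hashlib`.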
The searched file hash is included in 2 parent files, which are packages known to and seen by metalookup. A sample is included below:
Key | Value |
---|---|
MD5 | 15F966CD8177B2D5C28BF840A7CC0A73 |
PackageArch | noarch |
PackageDescription | A high-level cross-protocol url-grabber. Using urlgrabber, data can be fetched in three basic ways: urlgrab(url) copies the file to the local filesystem; urlopen(url) opens the remote file and returns a file object (like urllib2.urlopen); urlread(url) returns the contents of the file as a string. When using these functions (or methods), urlgrabber supports the following features: identical behavior for http://, ftp://, and file:// urls; http keepalive (faster downloads of many files by using only a single connection); byte ranges (fetch only a portion of the file); reget (for a urlgrab, resume a partial download); progress meters (the ability to report download progress automatically, even when using urlopen); throttling (restrict bandwidth usage); retries (automatically retry a download if it fails; the number of retries and the failure types are configurable); authenticated server access for http and ftp; proxy support (support for authenticated http and ftp proxies); mirror groups (treat a list of mirrors as a single source, automatically switching mirrors if there is a failure). |
PackageMaintainer | ngompa <ngompa> |
PackageName | python2-urlgrabber |
PackageRelease | 1.mga7 |
PackageVersion | 4.0.0 |
SHA-1 | EFA29E6D97AF7510C4CA48A44372D28D2BE0AB36 |
SHA-256 | D5C0588C5CD16F8B6DCCF2F65C2E31E6B5FAD8C716D3FA9D1FC0F5A4409942E5 |
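The configurable-retries feature named in the package description can be illustrated with a generic stdlib-only wrapper; this is a hedged sketch of the idea, not urlgrabber's actual implementation, and `attempts`, `delay`, and `retry_on` are illustrative parameter names:

```python
import time

def retrying(func, attempts=3, delay=0.0, retry_on=(OSError,)):
    """Call func(); on a listed failure type, retry up to `attempts` times.

    Mirrors the described behavior: both the number of retries and the
    failure types that trigger a retry are configurable.
    """
    last_exc = None
    for attempt in range(1, attempts + 1):
        try:
            return func()
        except retry_on as exc:
            last_exc = exc
            if attempt < attempts:
                time.sleep(delay)  # pause before the next attempt
    raise last_exc
```

In urlgrabber itself the equivalent knobs are passed as keyword options to the grab functions; the wrapper above only demonstrates the control flow.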
Key | Value |
---|---|
MD5 | 3353FC07EF6D710A31E22D23619340CA |
PackageArch | noarch |
PackageDescription | A high-level cross-protocol url-grabber. Using urlgrabber, data can be fetched in three basic ways: urlgrab(url) copies the file to the local filesystem; urlopen(url) opens the remote file and returns a file object (like urllib2.urlopen); urlread(url) returns the contents of the file as a string. When using these functions (or methods), urlgrabber supports the following features: identical behavior for http://, ftp://, and file:// urls; http keepalive (faster downloads of many files by using only a single connection); byte ranges (fetch only a portion of the file); reget (for a urlgrab, resume a partial download); progress meters (the ability to report download progress automatically, even when using urlopen); throttling (restrict bandwidth usage); retries (automatically retry a download if it fails; the number of retries and the failure types are configurable); authenticated server access for http and ftp; proxy support (support for authenticated http and ftp proxies); mirror groups (treat a list of mirrors as a single source, automatically switching mirrors if there is a failure). |
PackageMaintainer | ngompa <ngompa> |
PackageName | python3-urlgrabber |
PackageRelease | 1.mga7 |
PackageVersion | 4.0.0 |
SHA-1 | D44E72F41BAA2467C50BCA7625B8A79FFE5A6DBA |
SHA-256 | 1030D6EB84EE9640126645EAA0C7B0668CC90E8CB026CE7EB1E4E997B90C02AF |
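The mirror-group feature in the description (treat a list of mirrors as a single source, switching automatically on failure) can be sketched generically; this is an illustration of the concept, not urlgrabber's MirrorGroup API, and `mirrors` and `fetch` are hypothetical stand-ins:

```python
def fetch_from_mirrors(mirrors, fetch):
    """Try each mirror in order; return the first successful result.

    `mirrors` is a list of base URLs and `fetch` a callable that downloads
    from one mirror (both illustrative, not urlgrabber internals).
    """
    errors = []
    for mirror in mirrors:
        try:
            return fetch(mirror)
        except OSError as exc:
            errors.append((mirror, exc))  # record failure, try the next mirror
    raise OSError(f"all mirrors failed: {errors}")
```

The list behaves as one source: callers see a single call that either returns data from some working mirror or fails only after every mirror has failed.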