Key | Value |
---|---|
FileName | ./usr/share/doc/python3-zarr/html/_sources/release.rst.txt |
FileSize | 37284 |
MD5 | D98C7B00C508D6BD946477AA6F83EAFF |
SHA-1 | 145A90075234D9C2FFD6D0202E4DC9BAD99E15A4 |
SHA-256 | A780BB324F2C70E24D1FA7849C645372B0A20AD2E4B440ABF3EB4968033FBEC6 |
SSDEEP | 768:iFoP1wvicTeHJoEL939D9dGx7ucd89agKX2037tkvX496:GiXHPLx9A7ucGkjXRxkg96 |
TLSH | T1F8F2A42E722C2F724553053299943DCBB759C0AC7325B608487E931C163E935BFBBBA9 |
hashlookup:parent-total | 1 |
hashlookup:trust | 55 |
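The MD5, SHA-1, and SHA-256 digests in the table can be reproduced with Python's standard `hashlib` (SSDEEP and TLSH require third-party libraries and are not covered here). This is a minimal sketch; the `file_digests` helper name is illustrative, not part of any lookup service API.

```python
import hashlib

def file_digests(path, chunk_size=1 << 20):
    """Compute MD5, SHA-1 and SHA-256 of a file, reading it in chunks
    so large files are not loaded into memory at once."""
    hashers = {name: hashlib.new(name) for name in ("md5", "sha1", "sha256")}
    with open(path, "rb") as fh:
        while chunk := fh.read(chunk_size):
            for h in hashers.values():
                h.update(chunk)
    # Upper-case hex to match the formatting used in the table above.
    return {name: h.hexdigest().upper() for name, h in hashers.items()}
```

Running this over the file at the listed path should yield exactly the digests shown above if the file is unmodified.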
The searched file hash is included in 1 parent file, which belongs to a package known to and seen by metalookup. A sample is included below:
Key | Value |
---|---|
FileSize | 265776 |
MD5 | D89D83DE08A3F23D035303E5574F7299 |
PackageDescription | chunked, compressed, N-dimensional arrays for Python. Zarr is a Python package providing an implementation of compressed, chunked, N-dimensional arrays, designed for use in parallel computing. Some highlights: create N-dimensional arrays with any NumPy dtype; chunk arrays along any dimension; compress chunks using the fast Blosc meta-compressor or alternatively zlib, BZ2 or LZMA; store arrays in memory, on disk, inside a Zip file, on S3, ...; read from and write to an array concurrently from multiple threads or processes; organize arrays into hierarchies via groups; use filters to preprocess data and improve compression. |
PackageMaintainer | Debian Science Maintainers <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | python3-zarr |
PackageSection | python |
PackageVersion | 2.10.1+ds-1 |
SHA-1 | F0F6DBFCA82D7335F3F13327560F1BC4AA8A0784 |
SHA-256 | 14D5034D986EC88014E4C65FAC599ECCD560CD3501AB21AE259D9E601B9D9239 |