Key | Value |
---|---|
FileName | ./usr/share/doc/python3-zarr/html/api/attrs.html |
FileSize | 15566 |
MD5 | 171E0FA1EB601C933C39CB6902D21C8C |
SHA-1 | 0EDCD84DDCFACE327C43A2F78F75F3BF78836406 |
SHA-256 | 804A2C5638B887AC4958ACF925A88FABBEB31FB9F458BF1CBD828F1D6453118D |
SSDEEP | 384:0dvoXkkGbs67MyT0yYhyTfyFbAEGbb6pbOGFbvlGsUpZre12CsSYzNHo1qXNKc:GLkGbs67MyT0yYhyTfyFbAJb6pbOGFb8 |
TLSH | T12562DDB284F25533193381DADAEA1B3AB0DAC05EE0410C59F6FC53AA87CFD84B95785D |
hashlookup:parent-total | 1 |
hashlookup:trust | 55 |
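The hashes above can be re-checked programmatically. Below is a minimal sketch, assuming the public CIRCL hashlookup REST endpoint (`https://hashlookup.circl.lu/lookup/sha1/<hash>`); the endpoint name and the returned `hashlookup:parent-total` / `hashlookup:trust` fields are assumptions based on the keys shown in the table, not guaranteed by this report.

```python
import json
import urllib.request

def lookup_sha1(sha1: str) -> dict:
    """Query the (assumed) CIRCL hashlookup service for a SHA-1 and return the JSON record."""
    url = f"https://hashlookup.circl.lu/lookup/sha1/{sha1}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

# SHA-1 taken from the table above.
record = lookup_sha1("0EDCD84DDCFACE327C43A2F78F75F3BF78836406")
print(record.get("hashlookup:parent-total"), record.get("hashlookup:trust"))
```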
The searched file hash is included in 1 parent file, which belongs to a package known to and seen by metalookup. A sample is included below:
Key | Value |
---|---|
FileSize | 265776 |
MD5 | D89D83DE08A3F23D035303E5574F7299 |
PackageDescription | chunked, compressed, N-dimensional arrays for Python. Zarr is a Python package providing an implementation of compressed, chunked, N-dimensional arrays, designed for use in parallel computing. Some highlights: create N-dimensional arrays with any NumPy dtype; chunk arrays along any dimension; compress chunks using the fast Blosc meta-compressor or alternatively using zlib, BZ2 or LZMA; store arrays in memory, on disk, inside a Zip file, on S3, ...; read an array concurrently from multiple threads or processes; write to an array concurrently from multiple threads or processes; organize arrays into hierarchies via groups; use filters to preprocess data and improve compression. |
PackageMaintainer | Debian Science Maintainers <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | python3-zarr |
PackageSection | python |
PackageVersion | 2.10.1+ds-1 |
SHA-1 | F0F6DBFCA82D7335F3F13327560F1BC4AA8A0784 |
SHA-256 | 14D5034D986EC88014E4C65FAC599ECCD560CD3501AB21AE259D9E601B9D9239 |
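The package description above lists the core Zarr features (chunking, compression, groups, and user attributes, the latter being what the documented `attrs.html` page covers). The following is a minimal sketch of that usage against the zarr 2.x API shipped in this package version; the chosen shapes, chunk sizes, and attribute names are illustrative only.

```python
import numpy as np
import zarr
from numcodecs import Blosc

# Create a chunked, compressed 2-D array using the Blosc meta-compressor (zstd codec).
compressor = Blosc(cname="zstd", clevel=3, shuffle=Blosc.BITSHUFFLE)
z = zarr.zeros((1000, 1000), chunks=(100, 100), dtype="f4", compressor=compressor)
z[:] = np.random.random((1000, 1000))

# Attach user attributes via the .attrs mapping (the API documented in attrs.html).
z.attrs["units"] = "metres"
z.attrs["description"] = "example array"
print(dict(z.attrs))

# Organize arrays into a hierarchy via groups.
root = zarr.group()
grp = root.create_group("measurements")
temp = grp.create_dataset("temperature", shape=(100,), chunks=(10,), dtype="f8")
temp.attrs["units"] = "kelvin"
```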