Key | Value |
---|---|
FileName | ./usr/share/doc/python3-zarr/html/py-modindex.html |
FileSize | 6654 |
MD5 | 51FD2EA1693813BB1A6985D75EA5B89A |
SHA-1 | 1EFC6D7161807EC6DF6655FEAFC4D26265E136C4 |
SHA-256 | 02846CB10B7E71656BF6B41A644E18928E9342BD0EC375E412DA03E6277F6EC2 |
SSDEEP | 96:HkEDmLT8WErII+tcirswIvmcurdQPEomZ6IcaG8emMKE3eEscB7Nn8k:QTLqN+Yv1urdQNQ357lZhzCx8k |
TLSH | T162D11111A8F1B13B201281A959907F6EBCC291A7D7676C443CAD4BFB8F41FA05DA734E |
hashlookup:parent-total | 1 |
hashlookup:trust | 55 |
The searched file hash is included in 1 parent file, which corresponds to a package known and seen by hashlookup. A sample is included below:
Key | Value |
---|---|
FileSize | 246516 |
MD5 | AF586FECE4C07569FC85AD8D88B6F105 |
PackageDescription | chunked, compressed, N-dimensional arrays for Python. Zarr is a Python package providing an implementation of compressed, chunked, N-dimensional arrays, designed for use in parallel computing. Some highlights: create N-dimensional arrays with any NumPy dtype; chunk arrays along any dimension; compress chunks using the fast Blosc meta-compressor, or alternatively zlib, BZ2 or LZMA; store arrays in memory, on disk, inside a Zip file, on S3, ...; read an array concurrently from multiple threads or processes; write to an array concurrently from multiple threads or processes; organize arrays into hierarchies via groups; use filters to preprocess data and improve compression (see the usage sketch after this table). |
PackageMaintainer | Debian Science Maintainers <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | python3-zarr |
PackageSection | python |
PackageVersion | 2.6.1+ds-1 |
SHA-1 | D8F8B84F73EE02FA5CF27377B07930D2CAD448CD |
SHA-256 | C1308736915EEFB66DC1D4E4E3D5B0FF2B24EDF6B5D2C153964E47FAB5D441E7 |
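To illustrate the features listed in the package description, below is a minimal sketch of zarr 2.x usage, the API generation shipped in 2.6.1+ds-1. The store paths `example.zarr` and `hierarchy.zarr`, the shapes, the chunking, and the compressor settings are illustrative assumptions, not values taken from the package itself.

```python
import zarr
from numcodecs import Blosc

# Chunked, Blosc-compressed 2-D array backed by a directory store on disk.
# Path, shape, chunking and compressor settings are illustrative choices.
compressor = Blosc(cname='zstd', clevel=3, shuffle=Blosc.BITSHUFFLE)
z = zarr.open(
    'example.zarr',
    mode='w',
    shape=(10000, 10000),
    chunks=(1000, 1000),   # chunk along both dimensions
    dtype='i4',
    compressor=compressor,
)
z[0, :] = 42               # writes touch only the chunks overlapping the slice

# Arrays organized into a hierarchy via groups.
root = zarr.open_group('hierarchy.zarr', mode='w')
grp = root.create_group('measurements')
temp = grp.zeros('temperature', shape=(365, 24), chunks=(30, 24), dtype='f8')
print(temp.info)
```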