Key | Value |
---|---|
FileName | ./usr/share/doc/python3-zarr/html/api/n5.html |
FileSize | 11939 |
MD5 | BF3EF434608244A746BD2D1A56A6D04A |
SHA-1 | 088F657C6117DAF73398973185640231EEE2778A |
SHA-256 | 0984FDC03D2F23C8868CF2A2119E216EDF2D2D7AA1209C1E8C57FC03F7F73B2F |
SSDEEP | 192:0YyuvoXjIPWY7vMyt7fu5Y7u50rQLk8LSecIlIuf7+7P8IxI675LFjLKw717Sc:0nuvoXkPNb5t7fyY7y0rYk8eeS3ec |
TLSH | T1DB3298A1A4F79437003385C3A6EE2B39B1E2916FE4460441B2FD93AC4BDED547907D6E |
hashlookup:parent-total | 1 |
hashlookup:trust | 55 |
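The MD5, SHA-1 and SHA-256 digests above can be reproduced with Python's standard `hashlib`; SSDEEP and TLSH are fuzzy hashes that need third-party libraries (e.g. `ssdeep`, `py-tlsh`) and are omitted here. A minimal sketch, assuming a local copy of the file at the recorded path:

```python
import hashlib

def file_digests(path: str, chunk_size: int = 65536) -> dict:
    """Compute MD5, SHA-1 and SHA-256 of a file by streaming it in chunks."""
    hashers = {name: hashlib.new(name) for name in ("md5", "sha1", "sha256")}
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            for h in hashers.values():
                h.update(chunk)
    # Upper-case hex to match the record's formatting
    return {name: h.hexdigest().upper() for name, h in hashers.items()}

# Hypothetical usage against a locally extracted copy of the package:
# file_digests("usr/share/doc/python3-zarr/html/api/n5.html")
```

If the computed MD5 matches the value in the table, the local file is byte-identical to the one indexed here.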
The searched file hash is included in 1 parent file, a package known and seen by metalookup. A sample is included below:
Key | Value |
---|---|
FileSize | 265776 |
MD5 | D89D83DE08A3F23D035303E5574F7299 |
PackageDescription | chunked, compressed, N-dimensional arrays for Python. Zarr is a Python package providing an implementation of compressed, chunked, N-dimensional arrays, designed for use in parallel computing. Some highlights: create N-dimensional arrays with any NumPy dtype; chunk arrays along any dimension; compress chunks using the fast Blosc meta-compressor or alternatively zlib, BZ2 or LZMA; store arrays in memory, on disk, inside a Zip file, on S3, ...; read an array concurrently from multiple threads or processes; write to an array concurrently from multiple threads or processes; organize arrays into hierarchies via groups; use filters to preprocess data and improve compression. |
PackageMaintainer | Debian Science Maintainers <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | python3-zarr |
PackageSection | python |
PackageVersion | 2.10.1+ds-1 |
SHA-1 | F0F6DBFCA82D7335F3F13327560F1BC4AA8A0784 |
SHA-256 | 14D5034D986EC88014E4C65FAC599ECCD560CD3501AB21AE259D9E601B9D9239 |