Key | Value |
---|---|
FileName | ./usr/share/doc/python3-zarr/html/api/core.html |
FileSize | 155021 |
MD5 | 460888C4D7B1590B02C914CE28C70E84 |
SHA-1 | 1281F05544A087554ED1F61F4ABA204AECC2D6F8 |
SHA-256 | 87F57323B39CC1F6C25699B7857D29E83E5EDBD0D1DFCE7FB3810AE89FCD8BB6 |
SSDEEP | 1536:WQ8P6C/kdCYMmxy5Wfc5q4bt64ldTwWlH5Wzc5q2BX1RUkgXLoROVOg0puP/iD7I:WFWiVt8P3qNjMeDWi/jKKHsOon |
TLSH | T106E3FBA0E1B79133003B99C342FF1B79B1E5942AE0960485A7FDA7BD47DCC50781BA6E |
hashlookup:parent-total | 1 |
hashlookup:trust | 55 |
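The `hashlookup:parent-total` and `hashlookup:trust` fields above come from a hash lookup service. As a minimal sketch only, assuming the public CIRCL hashlookup REST endpoint (`/lookup/sha1/<hash>`) and the Python `requests` package are available, a record like the one in this table could be retrieved as follows; field names beyond those shown in the table are not guaranteed.

```python
# Minimal sketch: look up the SHA-1 from the table above against the public
# CIRCL hashlookup service. Endpoint and response fields are assumptions based
# on the hashlookup:* keys shown in this report, not part of the report itself.
import requests

SHA1 = "1281F05544A087554ED1F61F4ABA204AECC2D6F8"  # SHA-1 from the table above

resp = requests.get(
    f"https://hashlookup.circl.lu/lookup/sha1/{SHA1.lower()}",  # lowercase for safety
    timeout=10,
)
resp.raise_for_status()
record = resp.json()

# Print the fields mirrored in the table: file name, trust score, parent count.
print("FileName:               ", record.get("FileName"))
print("hashlookup:trust:       ", record.get("hashlookup:trust"))
print("hashlookup:parent-total:", record.get("hashlookup:parent-total"))
```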
The searched file hash is included in 1 parent file, which belongs to a package known and seen by hashlookup. A sample is included below:
Key | Value |
---|---|
FileSize | 241336 |
MD5 | 59CCD768C4DACD2D9ED9758FE16DD5B4 |
PackageDescription | chunked, compressed, N-dimensional arrays for Python. Zarr is a Python package providing an implementation of compressed, chunked, N-dimensional arrays, designed for use in parallel computing. Some highlights: - Create N-dimensional arrays with any NumPy dtype. - Chunk arrays along any dimension. - Compress chunks using the fast Blosc meta-compressor or alternatively using zlib, BZ2 or LZMA. - Store arrays in memory, on disk, inside a Zip file, on S3, ... - Read an array concurrently from multiple threads or processes. - Write to an array concurrently from multiple threads or processes. - Organize arrays into hierarchies via groups. - Use filters to preprocess data and improve compression. (A brief usage sketch follows the table below.) |
PackageMaintainer | Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com> |
PackageName | python3-zarr |
PackageSection | python |
PackageVersion | 2.6.1+ds-1 |
SHA-1 | 14253B3BA86B747D331EB12A47409C136DE4D023 |
SHA-256 | CA6388A803F3F9AC3AA25DA6807BB7D08F58BC9731E916097A88337C63F78963 |
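Since the package description above summarizes zarr's main features, here is a brief usage sketch for orientation. It targets the zarr 2.x API shipped by this package version (2.6.1); the store paths and parameter values are illustrative, and numpy and numcodecs are assumed to be installed as dependencies of python3-zarr. This is not taken from the packaged documentation file itself.

```python
# Minimal sketch of the zarr 2.x features listed in the package description:
# chunked, Blosc-compressed N-dimensional arrays stored on disk, plus groups.
import numpy as np
import zarr
from numcodecs import Blosc  # Blosc meta-compressor, a zarr dependency

# Create a chunked, Blosc-compressed 2-D array backed by a directory store on disk.
z = zarr.open(
    "example.zarr",          # illustrative path
    mode="w",
    shape=(1000, 1000),
    chunks=(100, 100),
    dtype="f8",
    compressor=Blosc(cname="zstd", clevel=3),
)

# Write and read back one chunk-sized block; compression happens transparently.
z[0:100, 0:100] = np.random.random((100, 100))
print(z.info)

# Organize arrays into a hierarchy via groups.
root = zarr.open_group("example_group.zarr", mode="w")
root.create_dataset("measurements", shape=(10, 10), chunks=(5, 5), dtype="i4")
print(root["measurements"].shape)
```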