Key | Value
---|---
FileName | ./usr/lib/aarch64-linux-gnu/cmake/dlpack/dlpackConfig.cmake |
FileSize | 1354 |
MD5 | 864ACFA6A0E3145F62B4CE6BDA970193 |
SHA-1 | A9C0F43811BD9AF9FEC053BA4FB0076907246B37 |
SHA-256 | 58C2D5A9AA072E086302925F9C562E7EC97CB2BA8862DAFDFD7EED6FE1DAC7B3 |
SSDEEP | 24:73aKk6GFLDEZv6gkeFgJXR2FZ1j5X0wQbG0qQqz2K5aG2cRtzW2v:7XGpwV6VeFOh2FZXMbGrzbk2t3 |
TLSH | T1292132B427991C608397C1507296B51F08C6547FBEB34840FA8FD29923DD1B05A833FA |
hashlookup:parent-total | 2 |
hashlookup:trust | 60 |
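The cryptographic digests above can be re-checked locally. Below is a minimal sketch, assuming the file is installed at the listed FileName path (written here as an absolute path); the SSDEEP and TLSH fuzzy hashes require separate tooling and are not covered:

```python
# Minimal sketch: recompute the digests listed in the table above.
# Assumes the file is installed locally at the FileName path.
import hashlib

path = "/usr/lib/aarch64-linux-gnu/cmake/dlpack/dlpackConfig.cmake"
expected_sha256 = "58C2D5A9AA072E086302925F9C562E7EC97CB2BA8862DAFDFD7EED6FE1DAC7B3"

with open(path, "rb") as f:
    data = f.read()

print("FileSize:", len(data))                          # table lists 1354
print("MD5:     ", hashlib.md5(data).hexdigest().upper())
print("SHA-1:   ", hashlib.sha1(data).hexdigest().upper())
print("SHA-256 matches:",
      hashlib.sha256(data).hexdigest().upper() == expected_sha256)
```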
The searched file hash is included in 2 parent files, which belong to packages known to and seen by metalookup. Samples are included below:
Key | Value
---|---
FileSize | 6812 |
MD5 | D99C82BB07463753B015B833465C0F0F |
PackageDescription | Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: * Easier sharing of operators between deep learning frameworks. * Easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops. * Quick swapping of backend implementations, such as different versions of BLAS. * For end users, this could bring more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.6-1 |
SHA-1 | 80B4FD317857099C027C7036636664BAD3D6F8CE |
SHA-256 | 163FFF4B35EFC00920EF637827488218D45E0B0440BD2F803951F10FD27B87E1 |
Key | Value
---|---
FileSize | 6348 |
MD5 | CE1D5B8C5D3FB3342E7F9E9F2A26E6B6 |
PackageDescription | Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: * Easier sharing of operators between deep learning frameworks. * Easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops. * Quick swapping of backend implementations, such as different versions of BLAS. * For end users, this could bring more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.0~git20200217.3ec0443-2 |
SHA-1 | 62751B97167FBEE43A64CFD142D55814AE7C76FD |
SHA-256 | BFD28E2D0C53B2222B58391733F7DECA4FF99A77D2D07EA4D4B42ADB5EB9500E |
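Both parent packages ship DLPack as a small C header, and the dlpackConfig.cmake file above is the CMake package configuration that lets downstream projects locate it with find_package(dlpack). The cross-framework sharing described in the PackageDescription can be illustrated from Python using NumPy's DLPack support (an assumption of this sketch, not something shipped by libdlpack-dev; requires NumPy 1.22 or later):

```python
# Minimal sketch of the zero-copy sharing that DLPack enables, using
# NumPy's implementation of the DLPack protocol (NumPy >= 1.22).
import numpy as np

a = np.arange(6, dtype=np.float32).reshape(2, 3)

# Any DLPack-aware consumer (another framework, or NumPy itself) can wrap
# the same buffer through the protocol; no data is copied.
b = np.from_dlpack(a)

a[0, 0] = 99.0                   # modify the original buffer...
print(b[0, 0])                   # ...the change is visible through the view: 99.0
print(np.shares_memory(a, b))    # True: both arrays use the same memory
```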