Key | Value |
---|---|
FileName | ./usr/lib/x86_64-linux-gnu/cmake/dlpack/dlpackTargets.cmake |
FileSize | 3841 |
MD5 | E9FA6E9A582AD420EA7F5D46CE8B9233 |
SHA-1 | 50DB8EB327F9F462F34D8971C715EFCA2BE3EFC7 |
SHA-256 | AF64CC127F7B8ED3A2EF5A7996AF6E2394BD6049F3031C3F887F45E07F6DF0FD |
SSDEEP | 96:ZNz5rU4EhXxmgKT3koZEbie87hPT/L+XmNbUM:5xVj97hPTy2P |
TLSH | T1CC8141661F5B09E003E3D3913A94F51AE051D5BBBF4365A9FC86724C22FC2184A8F27B |
hashlookup:parent-total | 2 |
hashlookup:trust | 60 |
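
The MD5, SHA-1 and SHA-256 digests above can be reproduced locally; below is a minimal Python sketch (standard library only). The FileName field is archive-relative ("./usr/..."), so the installed system path is assumed here; the SSDEEP and TLSH fuzzy hashes require third-party libraries and are not covered:

```python
import hashlib

# Path taken from the FileName field above; adjust if the file lives elsewhere.
path = "/usr/lib/x86_64-linux-gnu/cmake/dlpack/dlpackTargets.cmake"

digests = {"MD5": hashlib.md5(), "SHA-1": hashlib.sha1(), "SHA-256": hashlib.sha256()}

with open(path, "rb") as f:
    while chunk := f.read(8192):
        for h in digests.values():
            h.update(chunk)

for name, h in digests.items():
    # The report lists hash values as upper-case hex.
    print(f"{name}: {h.hexdigest().upper()}")
```
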
The searched file hash is included in 2 parent files belonging to packages known to and seen by metalookup. A sample of each parent is included below, and a query sketch follows the samples:
Key | Value |
---|---|
FileSize | 6396 |
MD5 | DF2C2AB146DBE6A40C8DA1C495E1B42C |
PackageDescription | Open in-memory tensor structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Ubuntu Developers <ubuntu-devel-discuss@lists.ubuntu.com> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.0~git20200217.3ec0443-2 |
SHA-1 | 916CDD13C05AB3C16E3DE2182D5BD01EB1FC6DA8 |
SHA-256 | AAC8276FD3A3B6A10FD51E56603FB2C0FB5BFE72D00D70FA063FAC3E26CB2605 |

Key | Value |
---|---|
FileSize | 6356 |
MD5 | 93B3F9FA3DB176127845B81A01243B31 |
PackageDescription | Open in-memory tensor structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.0~git20200217.3ec0443-2 |
SHA-1 | C64BEEECAB8AE3C8CAFE4A51F4419B1F6D05B581 |
SHA-256 | 53D469C0CEF7E62BD216B42B8F6F8BC7712E12810D3A0C235ED70E136BBD0769 |
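
The hashlookup:parent-total and hashlookup:trust keys above follow the CIRCL hashlookup response format, so the record can in principle be re-queried over HTTP. The sketch below assumes the public hashlookup instance at hashlookup.circl.lu; the "metalookup" service named in this report may be a different or merely compatible instance, so treat the base URL as a placeholder:

```python
import json
import urllib.request

# SHA-256 of dlpackTargets.cmake, copied from the first table of this report.
SHA256 = "AF64CC127F7B8ED3A2EF5A7996AF6E2394BD6049F3031C3F887F45E07F6DF0FD"

# Assumed base URL: the public CIRCL hashlookup instance.
url = f"https://hashlookup.circl.lu/lookup/sha256/{SHA256.lower()}"

with urllib.request.urlopen(url) as resp:
    record = json.load(resp)

# These keys mirror the fields shown in the first table above.
for key in ("FileName", "FileSize", "hashlookup:parent-total", "hashlookup:trust"):
    print(key, "=", record.get(key))
```

If the response also lists parent hashes, each one can be looked up the same way to recover package details such as the two samples above.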