Key | Value |
---|---|
FileName | ./usr/lib/mipsel-linux-gnu/cmake/dlpack/dlpackConfig.cmake |
FileSize | 1353 |
MD5 | FBEB5875FEC2459F05AA1B246D794F75 |
SHA-1 | 200F369E7CD309C8DDB6F84455ECB3D320AC31E7 |
SHA-256 | AF21764EB257AAA85E05D200A07BB80A9A0BE0A46228E3DFF0FB8CFE1658C8F7 |
SSDEEP | 24:73aKk6GFLDEZv6gkekgJXR2FZ1j5X0wQbG0qQqz2K5aG2cRtzW2v:7XGpwV6VekOh2FZXMbGrzbk2t3 |
TLSH | T1BF2132B427991C609397D6507296B41F08C6557FBEB34440FA8FD28923DD1B05A833FA |
hashlookup:parent-total | 2 |
hashlookup:trust | 60 |
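The MD5, SHA-1 and SHA-256 digests above can be reproduced locally. A minimal sketch using Python's `hashlib`, assuming the file is available at the path given in the FileName field (the path and field names are taken from the table, not from any hashlookup client library):

```python
import hashlib

def multi_digest(path, chunk_size=65536):
    """Compute MD5, SHA-1 and SHA-256 of a file in a single streaming pass."""
    hashes = {name: hashlib.new(name) for name in ("md5", "sha1", "sha256")}
    with open(path, "rb") as fh:
        while chunk := fh.read(chunk_size):
            for h in hashes.values():
                h.update(chunk)
    # Upper-case hex to match the formatting used in this report
    return {name: h.hexdigest().upper() for name, h in hashes.items()}

# Hypothetical usage against the file described above:
# digests = multi_digest("./usr/lib/mipsel-linux-gnu/cmake/dlpack/dlpackConfig.cmake")
# digests["sha256"] should then equal the SHA-256 value in the table.
```

Note that SSDEEP and TLSH are fuzzy hashes and are not part of the standard library; they require the separate `ssdeep`/`tlsh` Python bindings.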
The searched file hash is included in 2 parent files belonging to packages known to and seen by hashlookup. A sample is included below:
Key | Value |
---|---|
FileSize | 7072 |
MD5 | 5956C29D7EC10F5B3EA3AAE07F2EE553 |
PackageDescription | Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: * Easier sharing of operators between deep learning frameworks. * Easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops. * Quick swapping of backend implementations, such as different versions of BLAS. * For end users, this could bring more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.6-1+b1 |
SHA-1 | 62D3E8F5F0F2B20B214CCDE98C4F5601AC17E608 |
SHA-256 | 7393BD8595328B49AC67764D718D9FE6DB8A943B68D0F57C43B7BB56A2600155 |
Key | Value |
---|---|
FileSize | 6352 |
MD5 | 05848FA7CEC35E3DE50D96129EF6C916 |
PackageDescription | Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: * Easier sharing of operators between deep learning frameworks. * Easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops. * Quick swapping of backend implementations, such as different versions of BLAS. * For end users, this could bring more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.0~git20200217.3ec0443-2 |
SHA-1 | CA04AA3B7B11F3E91CE6168BB183A338A1577F58 |
SHA-256 | D30D0E2C08E6C9DAAF843F16AA4DD06D687C2AEB3737062264223416C8A7FAD6 |