Key | Value |
---|---|
FileName | ./usr/lib/mips64el-linux-gnuabi64/cmake/dlpack/dlpackConfig.cmake |
FileSize | 1360 |
MD5 | DAC0B400012281DB3562BFC89288371D |
SHA-1 | CB429959E0494A0B3F53CFC03F4DA8E947D7C8F5 |
SHA-256 | 49CEB46DED555CB4EAA5A6704AB5626E9CB2AB959952F1192F7E114A5E0731E0 |
SSDEEP | 24:73aKk6GFLDEZv6gkeigJXR2FZ1j5X0wQbG0qQqz2K5aG2cRtzW2v:7XGpwV6VeiOh2FZXMbGrzbk2t3 |
TLSH | T1DB2110B427A91C609397D55062A6B41F08CA457FBEB34440FA8FD28923DD1B05A832BA |
hashlookup:parent-total | 2 |
hashlookup:trust | 60 |
The searched file hash is included in 2 parent files, which belong to packages known and seen by metalookup. Samples are included below:
Key | Value |
---|---|
FileSize | 7092 |
MD5 | 63E984CE97E2F9B1F293C965B5BFF597 |
PackageDescription | Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.6-1+b1 |
SHA-1 | 093EAAC2DD932A24A4BB40492C464C06EC981F38 |
SHA-256 | E0D33587E4DA87F77BF78E327F19591EBDC71B1FFC22160F4690EB10D3FBF594 |
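
The PackageDescription above characterizes DLPack as a shared in-memory tensor structure rather than a tensor/ops library. As a rough illustration only (not part of the hashlookup report), the sketch below fills a DLTensor that describes an existing CPU buffer, assuming the dlpack 0.6 header layout installed by libdlpack-dev:

```c
/* Minimal sketch, assuming the dlpack 0.6 API (DLTensor/DLDevice):
 * describe an existing CPU buffer so another framework could consume
 * it without copying. */
#include <dlpack/dlpack.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    static float data[2][3] = {{1, 2, 3}, {4, 5, 6}};
    int64_t shape[2] = {2, 3};

    DLTensor t;
    t.data = data;                    /* borrowed pointer, not owned */
    t.device.device_type = kDLCPU;
    t.device.device_id = 0;
    t.ndim = 2;
    t.dtype.code = kDLFloat;
    t.dtype.bits = 32;
    t.dtype.lanes = 1;
    t.shape = shape;
    t.strides = NULL;                 /* NULL means compact row-major layout */
    t.byte_offset = 0;

    printf("DLTensor: %d-d, %d-bit float\n", t.ndim, (int)t.dtype.bits);
    return 0;
}
```

Under these assumptions the snippet needs only the header shipped by libdlpack-dev; there is no library to link against, since DLPack is a header-only structure definition.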
Key | Value |
---|---|
FileSize | 6360 |
MD5 | B1BC5B189455B6FBB665A07D995A4DDD |
PackageDescription | Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks. |
PackageMaintainer | Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org> |
PackageName | libdlpack-dev |
PackageSection | science |
PackageVersion | 0.0~git20200217.3ec0443-2 |
SHA-1 | 0E30A7C1D69B43E6DD4850CFD20F190C5A2015A5 |
SHA-256 | 0A9C0663E0D28405488A99660D3826F60E4492A0BDDD3154ED21386132CBB7CB |