Result for CB429959E0494A0B3F53CFC03F4DA8E947D7C8F5

Query result

Key                        Value
FileName                   ./usr/lib/mips64el-linux-gnuabi64/cmake/dlpack/dlpackConfig.cmake
FileSize                   1360
MD5                        DAC0B400012281DB3562BFC89288371D
SHA-1                      CB429959E0494A0B3F53CFC03F4DA8E947D7C8F5
SHA-256                    49CEB46DED555CB4EAA5A6704AB5626E9CB2AB959952F1192F7E114A5E0731E0
SSDEEP                     24:73aKk6GFLDEZv6gkeigJXR2FZ1j5X0wQbG0qQqz2K5aG2cRtzW2v:7XGpwV6VeiOh2FZXMbGrzbk2t3
TLSH                       T1DB2110B427A91C609397D55062A6B41F08CA457FBEB34440FA8FD28923DD1B05A832BA
hashlookup:parent-total    2
hashlookup:trust           60
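The MD5, SHA-1, and SHA-256 digests above can be recomputed locally to confirm that a file on disk is the same artifact as the one recorded here. A minimal sketch using Python's standard `hashlib` (the local path in the comment is an assumption taken from the FileName field):

```python
import hashlib

def digests(path: str) -> dict:
    """Compute MD5, SHA-1, and SHA-256 of a file in a single pass."""
    md5, sha1, sha256 = hashlib.md5(), hashlib.sha1(), hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            for h in (md5, sha1, sha256):
                h.update(chunk)
    return {
        "MD5": md5.hexdigest().upper(),
        "SHA-1": sha1.hexdigest().upper(),
        "SHA-256": sha256.hexdigest().upper(),
    }

# Compare against the values reported above, e.g. (path is an assumption):
# digests("/usr/lib/mips64el-linux-gnuabi64/cmake/dlpack/dlpackConfig.cmake")["SHA-1"] \
#     == "CB429959E0494A0B3F53CFC03F4DA8E947D7C8F5"
```

All three digests are computed in one pass over the file, so a mismatch on any of them flags a different (or modified) artifact.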

Network graph view

Parents (Total: 2)

The searched file hash is included in 2 parent files, which belong to packages known and seen by metalookup. A sample is included below:
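A result like this can also be retrieved programmatically. A minimal sketch, assuming the public CIRCL hashlookup REST API at `https://hashlookup.circl.lu` and its `/lookup/sha1/<hash>` endpoint; the actual network call is left commented out:

```python
import json
from urllib.request import Request, urlopen

# Public hashlookup instance (assumption: reachable from your network).
HASHLOOKUP = "https://hashlookup.circl.lu"

def lookup_url(sha1: str) -> str:
    """Build the lookup URL for a SHA-1 digest."""
    return f"{HASHLOOKUP}/lookup/sha1/{sha1.lower()}"

def lookup_sha1(sha1: str) -> dict:
    """Query hashlookup for a SHA-1 digest; returns the JSON record (requires network)."""
    req = Request(lookup_url(sha1), headers={"Accept": "application/json"})
    with urlopen(req, timeout=10) as resp:
        return json.load(resp)

# Example (commented out to avoid a live network call):
# record = lookup_sha1("CB429959E0494A0B3F53CFC03F4DA8E947D7C8F5")
# record.get("hashlookup:parent-total")  # 2 for this file, per the result above
```

The returned JSON carries the same keys shown in the tables on this page (FileName, FileSize, the digests, and the `hashlookup:*` metadata fields).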

Key                   Value
FileSize              7092
MD5                   63E984CE97E2F9B1F293C965B5BFF597
PackageDescription    Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but to serve as a common bridge for reusing tensors and ops across frameworks.
PackageMaintainer     Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
PackageName           libdlpack-dev
PackageSection        science
PackageVersion        0.6-1+b1
SHA-1                 093EAAC2DD932A24A4BB40492C464C06EC981F38
SHA-256               E0D33587E4DA87F77BF78E327F19591EBDC71B1FFC22160F4690EB10D3FBF594
Key                   Value
FileSize              6360
MD5                   B1BC5B189455B6FBB665A07D995A4DDD
PackageDescription    Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but to serve as a common bridge for reusing tensors and ops across frameworks.
PackageMaintainer     Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
PackageName           libdlpack-dev
PackageSection        science
PackageVersion        0.0~git20200217.3ec0443-2
SHA-1                 0E30A7C1D69B43E6DD4850CFD20F190C5A2015A5
SHA-256               0A9C0663E0D28405488A99660D3826F60E4492A0BDDD3154ED21386132CBB7CB