Result for 200F369E7CD309C8DDB6F84455ECB3D320AC31E7

Query result

Key: Value
FileName: ./usr/lib/mipsel-linux-gnu/cmake/dlpack/dlpackConfig.cmake
FileSize: 1353
MD5: FBEB5875FEC2459F05AA1B246D794F75
SHA-1: 200F369E7CD309C8DDB6F84455ECB3D320AC31E7
SHA-256: AF21764EB257AAA85E05D200A07BB80A9A0BE0A46228E3DFF0FB8CFE1658C8F7
SSDEEP: 24:73aKk6GFLDEZv6gkekgJXR2FZ1j5X0wQbG0qQqz2K5aG2cRtzW2v:7XGpwV6VekOh2FZXMbGrzbk2t3
TLSH: T1BF2132B427991C609397D6507296B41F08C6557FBEB34440FA8FD28923DD1B05A833FA
hashlookup:parent-total: 2
hashlookup:trust: 60

Parents (Total: 2)

The searched file hash is included in 2 parent files, which belong to packages known to and seen by metalookup. The samples are included below:

Key: Value
FileSize: 7072
MD5: 5956C29D7EC10F5B3EA3AAE07F2EE553
PackageDescription: Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks.
PackageMaintainer: Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
PackageName: libdlpack-dev
PackageSection: science
PackageVersion: 0.6-1+b1
SHA-1: 62D3E8F5F0F2B20B214CCDE98C4F5601AC17E608
SHA-256: 7393BD8595328B49AC67764D718D9FE6DB8A943B68D0F57C43B7BB56A2600155
Key: Value
FileSize: 6352
MD5: 05848FA7CEC35E3DE50D96129EF6C916
PackageDescription: Open In-Memory Tensor Structure. DLPack is an open in-memory tensor structure for sharing tensors among frameworks. DLPack enables: easier sharing of operators between deep learning frameworks; easier wrapping of vendor-level operator implementations, allowing collaboration when introducing new devices/ops; quick swapping of backend implementations, such as different versions of BLAS; and, for end users, more operators and the possibility of mixing usage between frameworks. DLPack does not intend to implement Tensors and Ops itself, but instead serves as a common bridge for reusing tensors and ops across frameworks.
PackageMaintainer: Debian Deep Learning Team <debian-science-maintainers@lists.alioth.debian.org>
PackageName: libdlpack-dev
PackageSection: science
PackageVersion: 0.0~git20200217.3ec0443-2
SHA-1: CA04AA3B7B11F3E91CE6168BB183A338A1577F58
SHA-256: D30D0E2C08E6C9DAAF843F16AA4DD06D687C2AEB3737062264223416C8A7FAD6
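The parent records above can also be processed programmatically. A minimal sketch, assuming a JSON shape mirroring the key/value pairs shown (the field names `PackageName`, `PackageVersion`, and `SHA-1` come from the tables; the sample list is transcribed from them, not fetched from any API):

```python
# Sample parent records, transcribed from the two tables above.
parents = [
    {"PackageName": "libdlpack-dev", "PackageVersion": "0.6-1+b1",
     "SHA-1": "62D3E8F5F0F2B20B214CCDE98C4F5601AC17E608"},
    {"PackageName": "libdlpack-dev", "PackageVersion": "0.0~git20200217.3ec0443-2",
     "SHA-1": "CA04AA3B7B11F3E91CE6168BB183A338A1577F58"},
]


def summarize(parents: list) -> list:
    """Produce one 'name version (sha1)' line per parent package."""
    return ["{} {} ({})".format(p["PackageName"], p["PackageVersion"], p["SHA-1"])
            for p in parents]
```

Here `summarize(parents)` reduces each parent to a one-line summary, which is a convenient form for logging or for de-duplicating results across many lookups.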