Result for 002335328EEECCA592C4C5F125373DCFDB526AC0

Query result

Key Value
FileName ./usr/share/doc/shogun/examples/matlab_and_octave/tools/save_as_double.m
FileSize 164
MD5 035D9C8E58FDEC374BB2EB89A37501CA
SHA-1 002335328EEECCA592C4C5F125373DCFDB526AC0
SHA-256 05085AECD5C3E1357B67DBAB3D507FDF4255E525527203B41E6CD95A36E21162
SSDEEP 3:TMQPazip39AEu6fnXXhAEIzotW11uv0oXt5kVR/FtLSJACE3FhJQEevv:Az+3k6/WoO1uMVLtOt8a3
TLSH T1F4C08C67F9C2B18262B101100045B93DFE047164042BAF8784CA80F8BC3AEA59B03C3F
hashlookup:parent-total 6
hashlookup:trust 80
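
The digests above can be recomputed locally and compared against this record. Below is a minimal sketch using Python's standard hashlib module, assuming the file is installed at the path listed under FileName:

    import hashlib

    # Path as reported in the FileName field of this record.
    path = "/usr/share/doc/shogun/examples/matlab_and_octave/tools/save_as_double.m"

    # Digests as reported above (lower-cased for comparison).
    expected = {
        "md5": "035d9c8e58fdec374bb2eb89a37501ca",
        "sha1": "002335328eeecca592c4c5f125373dcfdb526ac0",
        "sha256": "05085aecd5c3e1357b67dbab3d507fdf4255e525527203b41e6cd95a36e21162",
    }

    with open(path, "rb") as fh:
        data = fh.read()

    for name, want in expected.items():
        got = hashlib.new(name, data).hexdigest()
        print(name, "OK" if got == want else "MISMATCH", got)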

Network graph view

Parents (Total: 6)

The searched file hash is included in 6 parent files, which are packages known and seen by metalookup. A sample is included below:
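
Each parent record shown below can also be retrieved individually by its SHA-1. The following sketch assumes the public CIRCL hashlookup API (hashlookup.circl.lu) and its /lookup/sha1/<hash> endpoint; the SHA-1 values used here are copied from the sample records below:

    import json
    import urllib.request

    # Parent SHA-1 values copied from the sample records below.
    parent_sha1s = [
        "CB873A801342D766C9D794B7B584CF42183A8C2B",
        "6B425709841A847AA3FC7B9406D58797384F2D74",
    ]

    # Assumed public hashlookup instance; adjust the base URL for a self-hosted service.
    BASE = "https://hashlookup.circl.lu/lookup/sha1/"

    for sha1 in parent_sha1s:
        with urllib.request.urlopen(BASE + sha1, timeout=10) as resp:
            record = json.load(resp)
        print(sha1, record.get("PackageName"), record.get("PackageVersion"))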

Key Value
MD5 B0C7C23D7EFB37E5F0DA69B08DE1E21A
PackageArch aarch64
PackageDescription The SHOGUN machine learning toolbox's focus is on large scale kernel methods and especially on Support Vector Machines (SVM). It provides a generic SVM object interfacing to several different SVM implementations, among them the state of the art LibSVM. Each of the SVMs can be combined with a variety of kernels. The toolbox not only provides efficient implementations of the most common kernels, like the Linear, Polynomial, Gaussian and Sigmoid Kernel but also comes with a number of recent string kernels as e.g. the Locality Improved, Fischer, TOP, Spectrum, Weighted Degree Kernel (with shifts). For the latter the efficient LINADD optimizations are implemented. Also SHOGUN offers the freedom of working with custom pre-computed kernels. One of its key features is the "combined kernel" which can be constructed by a weighted linear combination of a number of sub-kernels, each of which not necessarily working on the same domain. An optimal sub-kernel weighting can be learned using Multiple Kernel Learning. Currently SVM 2-class classification and regression problems can be dealt with. However SHOGUN also implements a number of linear methods like Linear Discriminant Analysis (LDA), Linear Programming Machine (LPM), (Kernel) Perceptrons and features algorithms to train hidden Markov-models. The input feature-objects can be dense, sparse or strings and of type int/short/double/char and can be converted into different feature types. Chains of "pre-processors" (e.g. subtracting the mean) can be attached to each feature object allowing for on-the-fly pre-processing. This build comes WITHOUT support for Thorsten Joachim's `SVM^light`, because of it's 'no-redistribute', 'no-commercial-use' license. This package contains the Octave-plugin for shogun.
PackageMaintainer Fedora Project
PackageName octave-shogun
PackageRelease 0.33.git20141224.d71e19a.fc22
PackageVersion 3.2.0.1
SHA-1 CB873A801342D766C9D794B7B584CF42183A8C2B
SHA-256 7B32B06B491C1486207F116992E8E2ED28F830BCBD007872AAF12AED1B1CD2A7
Key Value
MD5 128A8B8315A034520908AA7FD25443E7
PackageArch aarch64
PackageDescription The SHOGUN machine learning toolbox's focus is on large scale kernel methods and especially on Support Vector Machines (SVM). It provides a generic SVM object interfacing to several different SVM implementations, among them the state of the art LibSVM. Each of the SVMs can be combined with a variety of kernels. The toolbox not only provides efficient implementations of the most common kernels, like the Linear, Polynomial, Gaussian and Sigmoid Kernel but also comes with a number of recent string kernels as e.g. the Locality Improved, Fischer, TOP, Spectrum, Weighted Degree Kernel (with shifts). For the latter the efficient LINADD optimizations are implemented. Also SHOGUN offers the freedom of working with custom pre-computed kernels. One of its key features is the "combined kernel" which can be constructed by a weighted linear combination of a number of sub-kernels, each of which not necessarily working on the same domain. An optimal sub-kernel weighting can be learned using Multiple Kernel Learning. Currently SVM 2-class classification and regression problems can be dealt with. However SHOGUN also implements a number of linear methods like Linear Discriminant Analysis (LDA), Linear Programming Machine (LPM), (Kernel) Perceptrons and features algorithms to train hidden Markov-models. The input feature-objects can be dense, sparse or strings and of type int/short/double/char and can be converted into different feature types. Chains of "pre-processors" (e.g. subtracting the mean) can be attached to each feature object allowing for on-the-fly pre-processing. This build comes WITHOUT support for Thorsten Joachim's `SVM^light`, because of it's 'no-redistribute', 'no-commercial-use' license. This package contains the Octave-plugin for shogun.
PackageMaintainer Fedora Project
PackageName octave-shogun
PackageRelease 0.33.git20141224.d71e19a.fc22
PackageVersion 3.2.0.1
SHA-1 7F1C33791FF1A9AB1D8921E2FF89D437E633E85A
SHA-256 D21A17EBDB7F85BF340FF12344A1D0FCECBE64D25E0AB9BA5EB52A39FD296D29
Key Value
MD5 52FD8EB8EC2348D1E76EF4C3774BD7AE
PackageArch aarch64
PackageDescription The SHOGUN machine learning toolbox's focus is on large scale kernel methods and especially on Support Vector Machines (SVM). It provides a generic SVM object interfacing to several different SVM implementations, among them the state of the art LibSVM. Each of the SVMs can be combined with a variety of kernels. The toolbox not only provides efficient implementations of the most common kernels, like the Linear, Polynomial, Gaussian and Sigmoid Kernel but also comes with a number of recent string kernels as e.g. the Locality Improved, Fischer, TOP, Spectrum, Weighted Degree Kernel (with shifts). For the latter the efficient LINADD optimizations are implemented. Also SHOGUN offers the freedom of working with custom pre-computed kernels. One of its key features is the "combined kernel" which can be constructed by a weighted linear combination of a number of sub-kernels, each of which not necessarily working on the same domain. An optimal sub-kernel weighting can be learned using Multiple Kernel Learning. Currently SVM 2-class classification and regression problems can be dealt with. However SHOGUN also implements a number of linear methods like Linear Discriminant Analysis (LDA), Linear Programming Machine (LPM), (Kernel) Perceptrons and features algorithms to train hidden Markov-models. The input feature-objects can be dense, sparse or strings and of type int/short/double/char and can be converted into different feature types. Chains of "pre-processors" (e.g. subtracting the mean) can be attached to each feature object allowing for on-the-fly pre-processing. This build comes WITHOUT support for Thorsten Joachim's `SVM^light`, because of it's 'no-redistribute', 'no-commercial-use' license. This package contains the ChangeLog, a very detailed documentation, and some great examples for shogun. If you need the Chinese API-docs, you would want to install shogun-doc-cn, too.
PackageMaintainer Fedora Project
PackageName shogun-doc
PackageRelease 0.33.git20141224.d71e19a.fc22
PackageVersion 3.2.0.1
SHA-1 6B425709841A847AA3FC7B9406D58797384F2D74
SHA-256 C000A7DBAAA5718314D0FD8C3AFA5BC92E7673CEAB57EEBFC678AF86BD6CBE2D
Key Value
MD5 0C4979D4F74741B3776FB2EE58BBDD8F
PackageArch aarch64
PackageDescription This package contains very detailed documentation, and some great examples for shogun. If you need the Chinese API-docs, you would want to install shogun-doc-cn, too. The Shogun Machine learning toolbox provides a wide range of unified and efficient Machine Learning (ML) methods. The toolbox seamlessly allows to easily combine multiple data representations, algorithm classes, and general purpose tools. This enables both rapid prototyping of data pipelines and extensibility in terms of new algorithms. We combine modern software architecture in C++ with both efficient low-level computing back-ends and cutting edge algorithm implementations to solve large-scale Machine Learning problems (yet) on single machines. One of Shogun's most exciting features is that you can use the toolbox through a unified interface from C++, Python(3), Octave, R, Java, Lua, etc. This not just means that we are independent of trends in computing languages, but it also lets you use Shogun as a vehicle to expose your algorithm to multiple communities. We use SWIG to enable bidirectional communication between C++ and target languages. Shogun runs under Linux/Unix, MacOS, Windows. Originally focusing on large-scale kernel methods and bioinformatics (for a list of scientific papers mentioning Shogun, see here), the toolbox saw massive extensions to other fields in recent years. It now offers features that span the whole space of Machine Learning methods, including many classical methods in classification, regression, dimensionality reduction, clustering, but also more advanced algorithm classes such as metric, multi-task, structured output, and online learning, as well as feature hashing, ensemble methods, and optimization, just to name a few. Shogun in addition contains a number of exclusive state-of-the art algorithms such as a wealth of efficient SVM implementations, Multiple Kernel Learning, kernel hypothesis testing, Krylov methods, etc. All algorithms are supported by a collection of general purpose methods for evaluation, parameter tuning, preprocessing, serialization & I/O, etc; the resulting combinatorial possibilities are huge. The wealth of ML open-source software allows us to offer bindings to other sophisticated libraries including: LibSVM, LibLinear, LibOCAS, libqp, VowpalWabbit, Tapkee, SLEP, GPML and more. Shogun got initiated in 1999 by Soeren Sonnenburg and Gunnar Raetsch (that's where the name ShoGun originates from). It is now developed by a larger team of authors, and would not have been possible without the patches and bug reports by various people. See contributions for a detailed list. Statistics on Shogun's development activity can be found on ohloh.
PackageMaintainer Fedora Project
PackageName shogun-doc
PackageRelease 2.fc24
PackageVersion 4.1.0
SHA-1 B19E3540BFB0874DF32A5AF8145E401CA499F7AF
SHA-256 BB0495BEE2B8B9D8D92927920A7DA9B39A42F49760BB1472EBA3C68C16C00E0F
Key Value
MD5 B551AF23B2A6D51EB2BE158ACF8E3728
PackageArch aarch64
PackageDescription This package contains the Octave-plugin for shogun. The Shogun Machine learning toolbox provides a wide range of unified and efficient Machine Learning (ML) methods. The toolbox seamlessly allows to easily combine multiple data representations, algorithm classes, and general purpose tools. This enables both rapid prototyping of data pipelines and extensibility in terms of new algorithms. We combine modern software architecture in C++ with both efficient low-level computing back-ends and cutting edge algorithm implementations to solve large-scale Machine Learning problems (yet) on single machines. One of Shogun's most exciting features is that you can use the toolbox through a unified interface from C++, Python(3), Octave, R, Java, Lua, etc. This not just means that we are independent of trends in computing languages, but it also lets you use Shogun as a vehicle to expose your algorithm to multiple communities. We use SWIG to enable bidirectional communication between C++ and target languages. Shogun runs under Linux/Unix, MacOS, Windows. Originally focusing on large-scale kernel methods and bioinformatics (for a list of scientific papers mentioning Shogun, see here), the toolbox saw massive extensions to other fields in recent years. It now offers features that span the whole space of Machine Learning methods, including many classical methods in classification, regression, dimensionality reduction, clustering, but also more advanced algorithm classes such as metric, multi-task, structured output, and online learning, as well as feature hashing, ensemble methods, and optimization, just to name a few. Shogun in addition contains a number of exclusive state-of-the art algorithms such as a wealth of efficient SVM implementations, Multiple Kernel Learning, kernel hypothesis testing, Krylov methods, etc. All algorithms are supported by a collection of general purpose methods for evaluation, parameter tuning, preprocessing, serialization & I/O, etc; the resulting combinatorial possibilities are huge. The wealth of ML open-source software allows us to offer bindings to other sophisticated libraries including: LibSVM, LibLinear, LibOCAS, libqp, VowpalWabbit, Tapkee, SLEP, GPML and more. Shogun got initiated in 1999 by Soeren Sonnenburg and Gunnar Raetsch (that's where the name ShoGun originates from). It is now developed by a larger team of authors, and would not have been possible without the patches and bug reports by various people. See contributions for a detailed list. Statistics on Shogun's development activity can be found on ohloh.
PackageMaintainer Fedora Project
PackageName octave-shogun
PackageRelease 2.fc24
PackageVersion 4.1.0
SHA-1 24EB369C04CAFD1E5769023995AAD7E731196E29
SHA-256 342B47FC42D8855927347133DEDE4C9E911C6DE9E621511C5D384F64C9FE1096
Key Value
MD5 238979F9A06BCB35AC13A0EC7212902F
PackageArch aarch64
PackageDescription The SHOGUN machine learning toolbox's focus is on large scale kernel methods and especially on Support Vector Machines (SVM). It provides a generic SVM object interfacing to several different SVM implementations, among them the state of the art LibSVM. Each of the SVMs can be combined with a variety of kernels. The toolbox not only provides efficient implementations of the most common kernels, like the Linear, Polynomial, Gaussian and Sigmoid Kernel but also comes with a number of recent string kernels as e.g. the Locality Improved, Fischer, TOP, Spectrum, Weighted Degree Kernel (with shifts). For the latter the efficient LINADD optimizations are implemented. Also SHOGUN offers the freedom of working with custom pre-computed kernels. One of its key features is the "combined kernel" which can be constructed by a weighted linear combination of a number of sub-kernels, each of which not necessarily working on the same domain. An optimal sub-kernel weighting can be learned using Multiple Kernel Learning. Currently SVM 2-class classification and regression problems can be dealt with. However SHOGUN also implements a number of linear methods like Linear Discriminant Analysis (LDA), Linear Programming Machine (LPM), (Kernel) Perceptrons and features algorithms to train hidden Markov-models. The input feature-objects can be dense, sparse or strings and of type int/short/double/char and can be converted into different feature types. Chains of "pre-processors" (e.g. subtracting the mean) can be attached to each feature object allowing for on-the-fly pre-processing. This build comes WITHOUT support for Thorsten Joachim's `SVM^light`, because of it's 'no-redistribute', 'no-commercial-use' license. This package contains the ChangeLog, a very detailed documentation, and some great examples for shogun. If you need the Chinese API-docs, you would want to install shogun-doc-cn, too.
PackageMaintainer Fedora Project
PackageName shogun-doc
PackageRelease 0.33.git20141224.d71e19a.fc22
PackageVersion 3.2.0.1
SHA-1 A34E8EDB663192FBCD8365C7741B89EF5C161B57
SHA-256 2D3E44CD953AE0DEF8265D9F11BAB16E687E364AF995F54167E7ECA23226ADBA