
    Kernel methods for data analysis

    File: 201811073.pdf (1.657Mb)
    Date: 2020
    Author: Dave, Rishi
    Abstract
    Linear learning machines such as perceptrons and support vector machines are powerful tools for pattern recognition: they learn a hyperplane that separates the classes. In many real-life problems, however, such a linear separation is not possible. The kernel method (or kernel trick) introduces nonlinearity into these machines, and choosing an appropriate kernel increases their computational power. Kernel methods implicitly project the data into a higher-dimensional space. In this thesis, we study the properties of kernel methods, run experiments with several standard kernels, and compare their results on various datasets. We also experiment with a probabilistic kernel based on the Kullback-Leibler divergence, which operates on sets of vectors rather than on single feature vectors. Finally, we propose a new Mutual Information kernel and evaluate it experimentally, obtaining promising results. We use a Support Vector Machine (SVM) for the classification task.
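    The kernel trick the abstract refers to can be illustrated with a minimal sketch (the function names and example points below are illustrative, not from the thesis): for the degree-2 polynomial kernel, evaluating k(x, y) = (⟨x, y⟩ + 1)² in the input space gives exactly the same value as an ordinary dot product after an explicit projection into a higher-dimensional feature space, without ever computing that projection.

    ```python
    import math

    def poly_kernel(x, y):
        # Degree-2 polynomial kernel: k(x, y) = (<x, y> + 1)^2,
        # computed entirely in the original input space.
        dot = sum(a * b for a, b in zip(x, y))
        return (dot + 1) ** 2

    def phi(x):
        # Explicit feature map for 2-D input x = (x1, x2) that this
        # kernel implicitly uses:
        # phi(x) = (x1^2, x2^2, sqrt(2)*x1*x2, sqrt(2)*x1, sqrt(2)*x2, 1)
        x1, x2 = x
        r2 = math.sqrt(2)
        return (x1 * x1, x2 * x2, r2 * x1 * x2, r2 * x1, r2 * x2, 1.0)

    x, y = (1.0, 2.0), (3.0, 0.5)
    lhs = poly_kernel(x, y)                           # kernel in input space
    rhs = sum(a * b for a, b in zip(phi(x), phi(y)))  # dot product in feature space
    print(lhs, rhs)  # both equal 25.0
    ```

    Because an SVM's optimization and decision function depend on the training points only through such inner products, swapping the kernel function (RBF, polynomial, or the probabilistic kernels studied in the thesis) is enough to make a linear machine separate nonlinearly distributed classes.
    
    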
    URI
    http://drsr.daiict.ac.in//handle/123456789/979
    Collections
    • M Tech Dissertations [820]

    Resource Centre copyright © 2006-2017 
    Contact Us | Send Feedback
    Theme by Atmire NV