Browse Wiki & Semantic Web

http://dbpedia.org/resource/Stochastic_gradient_Langevin_dynamics
http://dbpedia.org/ontology/abstract Stochastic gradient Langevin dynamics (SGLD) is an optimization and sampling technique that combines characteristics of stochastic gradient descent, a Robbins–Monro optimization algorithm, and Langevin dynamics, a mathematical extension of molecular dynamics models. Like stochastic gradient descent, SGLD is an iterative optimization algorithm that uses minibatching to form a stochastic estimate of the gradient of a differentiable objective function. Unlike traditional SGD, SGLD can be used for Bayesian learning as a sampling method: it may be viewed as Langevin dynamics applied to a posterior distribution, with the key difference that the likelihood gradient terms are minibatched, as in SGD. Like Langevin dynamics, SGLD produces samples from a posterior distribution of parameters based on available data. First described by Welling and Teh in 2011, the method has applications in many contexts that require optimization, and is most notably applied in machine learning problems.
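The abstract above describes the SGLD update: a minibatched stochastic estimate of the log-posterior gradient, rescaled to the full dataset, plus injected Gaussian noise whose variance matches the step size. A minimal NumPy sketch of that update, under the Welling–Teh (2011) formulation; the function names and the `grad_log_post(theta, batch, scale)` interface are illustrative assumptions, not from any particular library:

```python
import numpy as np

def sgld_sample(grad_log_post, theta0, data, n_steps=1000,
                batch_size=32, step=1e-3, seed=0):
    """Sketch of stochastic gradient Langevin dynamics (SGLD).

    grad_log_post(theta, batch, scale) must return an unbiased estimate of
    the gradient of the log posterior at theta, with the minibatch
    likelihood term rescaled by scale = N / batch_size (this interface is
    an illustrative assumption).  Returns the trajectory of parameter
    samples; early iterates should be discarded as burn-in.
    """
    rng = np.random.default_rng(seed)
    theta = np.array(theta0, dtype=float)
    N = len(data)
    samples = []
    for _ in range(n_steps):
        # Minibatched gradient estimate, as in SGD.
        batch = data[rng.choice(N, size=batch_size, replace=False)]
        g = grad_log_post(theta, batch, N / batch_size)
        # Langevin noise: Gaussian with variance equal to the step size.
        noise = rng.normal(0.0, np.sqrt(step), size=theta.shape)
        theta = theta + 0.5 * step * g + noise
        samples.append(theta.copy())
    return np.array(samples)

# Toy usage: sample the posterior mean of a Gaussian with known unit
# variance and a wide N(0, 10) prior on the mean.
rng = np.random.default_rng(1)
data = rng.normal(2.0, 1.0, size=200)

def grad(mu, batch, scale):
    # d/dmu [log prior + rescaled minibatch log likelihood]
    return -mu / 10.0 + scale * np.sum(batch - mu)

chain = sgld_sample(grad, [0.0], data, n_steps=2000, batch_size=20)
```

With a fixed (non-decaying) step size, as here, the chain samples only an approximation of the posterior; Welling and Teh's analysis uses a Robbins–Monro-style decreasing step schedule so the discretization error and Metropolis correction vanish in the limit.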
http://dbpedia.org/ontology/thumbnail http://commons.wikimedia.org/wiki/Special:FilePath/Non-Convex_Objective_Function.gif?width=300 +
http://dbpedia.org/ontology/wikiPageID 58878004
http://dbpedia.org/ontology/wikiPageLength 9230
http://dbpedia.org/ontology/wikiPageRevisionID 1107781335
http://dbpedia.org/ontology/wikiPageWikiLink http://dbpedia.org/resource/Deep_learning + , http://dbpedia.org/resource/Artificial_neural_network + , http://dbpedia.org/resource/Lattice_field_theory + , http://dbpedia.org/resource/Hamiltonian_Monte_Carlo + , http://dbpedia.org/resource/Bayesian_inference + , http://dbpedia.org/resource/Category:Computational_statistics + , http://dbpedia.org/resource/Molecular_dynamics + , http://dbpedia.org/resource/Category:Gradient_methods + , http://dbpedia.org/resource/Langevin_dynamics + , http://dbpedia.org/resource/Stochastic_gradient_descent + , http://dbpedia.org/resource/Differentiable_function + , http://dbpedia.org/resource/Metropolis%E2%80%93Hastings_algorithm + , http://dbpedia.org/resource/Robbins%E2%80%93Monro_algorithm + , http://dbpedia.org/resource/Category:Optimization_algorithms_and_methods + , http://dbpedia.org/resource/Critical_point_%28mathematics%29 + , http://dbpedia.org/resource/Mathematical_optimization + , http://dbpedia.org/resource/File:Non-Convex_Objective_Function.gif + , http://dbpedia.org/resource/Gradient_descent + , http://dbpedia.org/resource/Convex_set + , http://dbpedia.org/resource/Total_variation + , http://dbpedia.org/resource/Objective_function + , http://dbpedia.org/resource/Category:Stochastic_optimization + , http://dbpedia.org/resource/Condition_number +
http://dbpedia.org/property/wikiPageUsesTemplate http://dbpedia.org/resource/Template:Orphan + , http://dbpedia.org/resource/Template:Citation_needed +
http://purl.org/dc/terms/subject http://dbpedia.org/resource/Category:Computational_statistics + , http://dbpedia.org/resource/Category:Stochastic_optimization + , http://dbpedia.org/resource/Category:Gradient_methods + , http://dbpedia.org/resource/Category:Optimization_algorithms_and_methods +
http://www.w3.org/ns/prov#wasDerivedFrom http://en.wikipedia.org/wiki/Stochastic_gradient_Langevin_dynamics?oldid=1107781335&ns=0 +
http://xmlns.com/foaf/0.1/depiction http://commons.wikimedia.org/wiki/Special:FilePath/Non-Convex_Objective_Function.gif +
http://xmlns.com/foaf/0.1/isPrimaryTopicOf http://en.wikipedia.org/wiki/Stochastic_gradient_Langevin_dynamics +
owl:sameAs https://global.dbpedia.org/id/9NGyb + , http://dbpedia.org/resource/Stochastic_gradient_Langevin_dynamics + , http://www.wikidata.org/entity/Q60750312 +
rdfs:comment Stochastic gradient Langevin dynamics (SGLD) is an optimization and sampling technique that combines characteristics of stochastic gradient descent, a Robbins–Monro optimization algorithm, and Langevin dynamics, a mathematical extension of molecular dynamics models. Like stochastic gradient descent, SGLD is an iterative optimization algorithm that uses minibatching to form a stochastic estimate of the gradient of a differentiable objective function. Unlike traditional SGD, SGLD can be used for Bayesian learning as a sampling method: it may be viewed as Langevin dynamics applied to a posterior distribution, with the key difference that the likelihood gradient terms are minibatched, as in SGD. Like Langevin dynamics, SGLD produces samples from a posterior distribution of parameters based on available data.
rdfs:label Stochastic gradient Langevin dynamics
Properties that link here:
http://dbpedia.org/resource/Stochastic_Gradient_Langevin_Dynamics + http://dbpedia.org/ontology/wikiPageWikiLink
http://en.wikipedia.org/wiki/Stochastic_gradient_Langevin_dynamics + http://xmlns.com/foaf/0.1/primaryTopic
http://dbpedia.org/resource/Stochastic_gradient_Langevin_dynamics + owl:sameAs