Browse Wiki & Semantic Web

http://dbpedia.org/resource/Neural_tangent_kernel
http://dbpedia.org/ontology/abstract In the study of artificial neural networks (ANNs), the neural tangent kernel (NTK) is a kernel that describes the evolution of deep artificial neural networks during their training by gradient descent. It allows ANNs to be studied using theoretical tools from kernel methods. For most common neural network architectures, in the limit of large layer width the NTK becomes constant. This enables simple closed-form statements to be made about neural network predictions, training dynamics, generalization, and loss surfaces. For example, it guarantees that wide enough ANNs converge to a global minimum when trained to minimize an empirical loss. The NTK of large-width networks is also related to several other large-width limits of neural networks. The NTK was introduced in 2018 by Arthur Jacot, Franck Gabriel and Clément Hongler. It was implicit in contemporaneous work on overparameterization.
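The abstract describes the NTK as a kernel built from the parameter gradients of a network, K(x, x') = ⟨∇θ f(x; θ), ∇θ f(x'; θ)⟩. A minimal NumPy sketch of this empirical NTK for a toy one-hidden-layer ReLU network follows; the network, its sizes, and all names here are illustrative assumptions, not taken from this page or from the linked neural-tangents library:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer ReLU network f(x) = a . relu(W x)
# (hypothetical model; widths and scalings chosen only for illustration)
d, m = 3, 1000                      # input dimension, hidden width
W = rng.normal(size=(m, d)) / np.sqrt(d)
a = rng.normal(size=m) / np.sqrt(m)

def grad_f(x):
    """Gradient of f(x) w.r.t. all parameters (a and W), flattened."""
    pre = W @ x                                   # pre-activations, shape (m,)
    g_a = np.maximum(pre, 0.0)                    # df/da_j = relu(w_j . x)
    g_W = (a * (pre > 0))[:, None] * x[None, :]   # df/dW_j = a_j 1[pre_j > 0] x
    return np.concatenate([g_a, g_W.ravel()])

def ntk(x1, x2):
    """Empirical NTK: inner product of parameter gradients at x1 and x2."""
    return grad_f(x1) @ grad_f(x2)

x, y = rng.normal(size=d), rng.normal(size=d)
K = np.array([[ntk(x, x), ntk(x, y)],
              [ntk(y, x), ntk(y, y)]])
print(K)  # 2x2 Gram matrix: symmetric and positive semi-definite
```

Symmetry and positive semi-definiteness hold at any width because each entry is an inner product of gradient vectors; in the large-width limit described in the abstract, this empirical kernel approaches a deterministic NTK that stays constant during training.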
http://dbpedia.org/ontology/wikiPageExternalLink https://www.quantamagazine.org/a-new-link-to-an-old-model-could-crack-the-mystery-of-deep-learning-20211011/ , https://github.com/google/neural-tangents
http://dbpedia.org/ontology/wikiPageID 62443864
http://dbpedia.org/ontology/wikiPageLength 19653
http://dbpedia.org/ontology/wikiPageRevisionID 1103547779
http://dbpedia.org/ontology/wikiPageWikiLink http://dbpedia.org/resource/Closed-form_expression , http://dbpedia.org/resource/Loss_function , http://dbpedia.org/resource/Gaussian_distribution , http://dbpedia.org/resource/Global_minimum , http://dbpedia.org/resource/Quanta_Magazine , http://dbpedia.org/resource/Deep_learning , http://dbpedia.org/resource/Kernel_regression , http://dbpedia.org/resource/Free_and_open-source , http://dbpedia.org/resource/Quadratic_loss_function , http://dbpedia.org/resource/Artificial_neural_network , http://dbpedia.org/resource/Python_%28programming_language%29 , http://dbpedia.org/resource/Category:Kernel_methods_for_machine_learning , http://dbpedia.org/resource/Independent_and_identically_distributed_random_variables , http://dbpedia.org/resource/Activation_function , http://dbpedia.org/resource/Kernel_methods_for_vector_output , http://dbpedia.org/resource/Large_width_limits_of_neural_networks , http://dbpedia.org/resource/Kernel_methods , http://dbpedia.org/resource/Feature_%28machine_learning%29 , http://dbpedia.org/resource/Maxima_and_minima , http://dbpedia.org/resource/Recurrent_neural_networks , http://dbpedia.org/resource/Positive-definite_kernel , http://dbpedia.org/resource/Convex_function , http://dbpedia.org/resource/Gaussian_random_variable , http://dbpedia.org/resource/Convolutional_neural_networks , http://dbpedia.org/resource/Convolutional_neural_network , http://dbpedia.org/resource/Neural_network_Gaussian_process , http://dbpedia.org/resource/Kernel_method , http://dbpedia.org/resource/Gradient_descent , http://dbpedia.org/resource/Transformer_%28machine_learning_model%29 , http://dbpedia.org/resource/Ordinary_differential_equation , http://dbpedia.org/resource/Affine_transformation , http://dbpedia.org/resource/Dataset_%28machine_learning%29
http://dbpedia.org/property/wikiPageUsesTemplate http://dbpedia.org/resource/Template:Toclimit , http://dbpedia.org/resource/Template:Reflist , http://dbpedia.org/resource/Template:Cite_web
http://purl.org/dc/terms/subject http://dbpedia.org/resource/Category:Kernel_methods_for_machine_learning
http://www.w3.org/ns/prov#wasDerivedFrom http://en.wikipedia.org/wiki/Neural_tangent_kernel?oldid=1103547779&ns=0
http://xmlns.com/foaf/0.1/isPrimaryTopicOf http://en.wikipedia.org/wiki/Neural_tangent_kernel
owl:sameAs http://dbpedia.org/resource/Neural_tangent_kernel , http://ca.dbpedia.org/resource/Kernel_de_tangent_neural , https://global.dbpedia.org/id/BzW1v , http://www.wikidata.org/entity/Q85788335 , http://pt.dbpedia.org/resource/Kernel_de_tangente_neural
rdfs:comment In the study of artificial neural networks (ANNs), the neural tangent kernel (NTK) is a kernel that describes the evolution of deep artificial neural networks during their training by gradient descent. It allows ANNs to be studied using theoretical tools from kernel methods. The NTK was introduced in 2018 by Arthur Jacot, Franck Gabriel and Clément Hongler. It was implicit in contemporaneous work on overparameterization.
rdfs:label Neural tangent kernel , Kernel de tangente neural , Kernel de tangent neural
Properties that link here
http://dbpedia.org/resource/NTK http://dbpedia.org/ontology/wikiPageDisambiguates
http://dbpedia.org/resource/Large_width_limits_of_neural_networks , http://dbpedia.org/resource/Kernel_method , http://dbpedia.org/resource/NTK , http://dbpedia.org/resource/Neural_network_Gaussian_process http://dbpedia.org/ontology/wikiPageWikiLink
http://en.wikipedia.org/wiki/Neural_tangent_kernel http://xmlns.com/foaf/0.1/primaryTopic