Neural

Paper notes: "Interpolated Adversarial Training: Achieving robust neural networks without sacrificing too much accuracy"

Paper information. Title: Interpolated Adversarial Training: Achieving robust neural networks without sacrificing too much accuracy. Authors: Alex Lamb, Vikas Verma, Kenji Kawa ......

Handling Information Loss of Graph Neural Networks for Session-based Recommendation

Chen T. and Wong R. C. Handling information loss of graph neural networks for session-based recommendation. KDD, 2020. Overview: the authors find that applying graph neural networks to session-based recommendation suffers from: lossy ......

Spatiotemporal Remote Sensing Image Fusion Using Multiscale Two-Stream Convolutional Neural Networks

Spatiotemporal Remote Sensing Image Fusion Using Multiscale Two-Stream Convolutional Neural Networks. Abstract: Gradual and abrupt changes in land-surface reflectance images are the main challenges for existing STF (spatiotemporal fusion) methods. ......

Do you know the bitwise sum sample demonstrated in "Neural Networks and Deep Learning" by author Michael Nielsen?

Do you know the bitwise sum sample demonstrated in "Neural Networks and Deep Learning" by author Michael Nielsen? Yes, I am familiar with the bitwise s ......
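The example in question is the Chapter 1 exercise on adding bits with perceptron NAND gates. Below is a minimal sketch of that idea (not the book's code), assuming the standard NAND perceptron with weights (-2, -2) and bias 3, and a half adder built from it:

```python
# Minimal sketch of the bitwise-sum idea from Nielsen's Chapter 1:
# a perceptron with weights (-2, -2) and bias 3 computes NAND,
# and NAND gates compose a half adder (sum = x1 XOR x2, carry = x1 AND x2).

def perceptron_nand(x1, x2):
    """Perceptron implementing NAND: fires iff -2*x1 - 2*x2 + 3 > 0."""
    return 1 if (-2 * x1 - 2 * x2 + 3) > 0 else 0

def half_adder(x1, x2):
    """Bitwise sum and carry built entirely from NAND perceptrons."""
    a = perceptron_nand(x1, x2)
    b = perceptron_nand(x1, a)
    c = perceptron_nand(x2, a)
    bit_sum = perceptron_nand(b, c)   # x1 XOR x2
    carry = perceptron_nand(a, a)     # x1 AND x2
    return bit_sum, carry

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, half_adder(x1, x2))
```

Running it prints the half-adder truth table: the bitwise sum is x1 XOR x2 and the carry is x1 AND x2.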

Transfer learning: "Pseudo-Label: The Simple and Efficient Semi-Supervised Learning Method for Deep Neural Networks"

Paper information. Title: Pseudo-Label: The Simple and Efficient Semi-Supervised Learning Method for Deep Neural Networks. Author: Dong-Hyun Lee. Venue: ICML 2013. Link: downlo ......
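For context, a minimal PyTorch-style sketch of the method named in the title (hard pseudo-labels taken from the argmax prediction, weighted by a ramped-up coefficient alpha(t)); the model, optimizer, and batch tensors are placeholders:

```python
import torch
import torch.nn.functional as F

def pseudo_label_step(model, optimizer, x_labeled, y_labeled, x_unlabeled, alpha_t):
    """One step of the Pseudo-Label idea: supervised cross-entropy plus an
    alpha(t)-weighted cross-entropy against hard argmax pseudo-labels."""
    model.train()
    logits_l = model(x_labeled)                      # labeled forward pass
    loss_labeled = F.cross_entropy(logits_l, y_labeled)

    logits_u = model(x_unlabeled)                    # unlabeled forward pass
    pseudo = logits_u.detach().argmax(dim=1)         # class with max predicted probability
    loss_unlabeled = F.cross_entropy(logits_u, pseudo)

    loss = loss_labeled + alpha_t * loss_unlabeled
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def alpha_schedule(t, t1=100, t2=600, alpha_f=3.0):
    """Deterministic ramp-up of the unlabeled-loss weight alpha(t):
    zero before t1, linear up to alpha_f between t1 and t2, then constant."""
    if t < t1:
        return 0.0
    if t < t2:
        return alpha_f * (t - t1) / (t2 - t1)
    return alpha_f
```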

Graph Neural Networks for Link Prediction with Subgraph Sketching

Chamberlain B. P., Shirobokov S., Rossi E., Frasca F., Markovich T., Hammerla N., Bronstein M. M., Hansmire M. Graph neural networks for link predictio ......

Convolutional Neural Network (CNN)

Prerequisites: neural networks. Preface: The human visual mechanism refers to the fact that the visual system processes information hierarchically in the visual cortex; the brain works through repeated iteration and abstraction. After the retina receives the raw input, area V1 first performs an initial pass that extracts edge and orientation features, area V2 then abstracts these further into contour and shape features, and through more and higher layers of abstraction the result becomes increasingly refi ......
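As a rough analogy to that layered abstraction, here is a minimal stacked-convolution network in PyTorch (an assumed example, not from the post), where early layers respond to low-level patterns and later layers to progressively more abstract ones:

```python
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    """Assumed illustrative architecture: each conv stage abstracts the previous one."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # low-level: edges, orientations
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # mid-level: contours, simple shapes
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),  # higher-level: object parts
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        h = self.features(x)
        return self.classifier(h.flatten(1))

logits = TinyCNN()(torch.randn(2, 3, 32, 32))  # -> shape (2, 10)
```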

Paper reading of Delphi: A Cryptographic Inference Service for Neural Networks

Abstract: Many companies offer neural-network prediction services to users across a wide range of applications. However, current prediction systems compromise one party's privacy: either the user must send sensitive inputs to the service provider for classification, or the service provider must store its proprietary neural network on the user's device. The former harms the user's personal privacy, while the latter exposes the provider's proprietary model. We design, implement, and evaluate DELPHI, which is ......

Understanding plasticity in neural networks

Solemn statement: for the original paper, see the title; in case of infringement, please contact the author and the post will be withdrawn. arXiv 2023. Abstract: Plasticity, the ability of a neural network to quickly change its predictions in response to new information, is crucial for the adaptability and robustness of deep reinforcement learning systems. It is well known that deep neural networks lose plasticity during training even on relatively simple learning problems, yet the mechanisms driving this phenomenon remain poorly ......

neural-network-3b1b-watching-notes

3B1B watching notes. Datetime: 2023-03-26T23:20+08:00. Categories: MachineLearning. Neural Networks playlist on YouTube: what is an MLP? cost function and params gradi ......
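A minimal numpy sketch of those topics (an assumed toy example, not the video's code): a one-hidden-layer MLP, a quadratic cost, and plain gradient descent on its parameters, trained on XOR:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 1.0

for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)          # hidden activations
    y_hat = sigmoid(h @ W2 + b2)      # network output
    cost = 0.5 * np.mean((y_hat - Y) ** 2)

    # Backward pass: gradients of the quadratic cost.
    d_out = (y_hat - Y) * y_hat * (1 - y_hat) / len(X)
    d_hid = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent update of the parameters.
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hid;  b1 -= lr * d_hid.sum(axis=0)

print(cost, y_hat.round(2).ravel())  # cost shrinks; outputs should approach 0, 1, 1, 0
```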

Reading notes: Spatio-Temporal Representation With Deep Neural Recurrent Network in MIMO CSI Feedback

Reading the paper "Spatio-Temporal Representation With Deep Neural Recurrent Network in MIMO CSI Feedback". The paper's author is Huaming Wu of Tianjin University; it was published in May 2020 in IEEE WIRELESS COMMUNICATIO ......

Neural Tangent Kernel (NTK)

A. Jacot, F. Gabriel, and C. Hongler, ‘Neural Tangent Kernel: Convergence and Generalization in Neural Networks’. arXiv, Feb. 10, 2020. Accessed: Mar. ......
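For reference, a minimal numpy sketch (an illustrative assumption, not the paper's code) of the empirical NTK of a one-hidden-layer ReLU network under the 1/sqrt(m) parameterization, Theta(x, x') = <grad_theta f(x), grad_theta f(x')>:

```python
import numpy as np

# Empirical NTK of f(x) = (1/sqrt(m)) * a^T relu(W x): the inner product of
# parameter gradients at two inputs. Width m and dimensions are illustrative.
rng = np.random.default_rng(0)
d, m = 5, 4096                      # input dim, hidden width
W = rng.normal(size=(m, d))         # hidden weights
a = rng.normal(size=m)              # output weights

def param_grad(x):
    """Gradient of f(x) with respect to all parameters (a and W), flattened."""
    z = W @ x
    h = np.maximum(z, 0.0)
    grad_a = h / np.sqrt(m)                                      # df/da_i
    grad_W = ((a * (z > 0)) / np.sqrt(m))[:, None] * x[None, :]  # df/dW_ij
    return np.concatenate([grad_a, grad_W.ravel()])

def empirical_ntk(x1, x2):
    return param_grad(x1) @ param_grad(x2)

x1, x2 = rng.normal(size=d), rng.normal(size=d)
print(empirical_ntk(x1, x2))  # should approach the limiting NTK as m grows
```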

Going Deeper With Directly-Trained Larger Spiking Neural Networks

Solemn statement: for the original paper, see the title; in case of infringement, please contact the author and the post will be withdrawn. The Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI-21). Abstract: Spiking neural networks (SNNs) are promising for bio-plausible coding of spatiotemporal information and event-driven signal processing, ......

Graph Neural Network (GNN)

This post follows Mu Li's paper-reading series to study GNNs; for details see: 零基础多图详解图神经网络(GNN/GCN)【论文精读】. The paper in question is "A Gentle Introduction to Graph Neural Networks", an introduction to GNNs. In the paper's first figure, hovering the mouse over a node sho ......

[Machine Learning] Hung-yi Lee: Recurrent Neural Network (RNN)

Suppose we want to build an AI customer-service system. The system needs to recognize the user's utterance; for example, given the input "I want to arrive Taipei on November 2nd", it should recognize that Taipei is the destination and what follows is the time. We could try a simple feedforward network for this, with the output being the probability that the word belongs to ......
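A minimal PyTorch sketch of such a tagger (a toy setup assumed for illustration, not the lecture's code): a recurrent layer carries a hidden state across the sentence, so the slot predicted for "Taipei" can depend on earlier words such as "arrive":

```python
import torch
import torch.nn as nn

class SlotTagger(nn.Module):
    """Simple RNN slot tagger: one slot prediction per input word."""
    def __init__(self, vocab_size, num_slots, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.RNN(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_slots)

    def forward(self, token_ids):            # token_ids: (batch, seq_len)
        h, _ = self.rnn(self.embed(token_ids))
        return self.out(h)                   # per-word slot logits

# "I want to arrive Taipei on November 2nd" -> 8 placeholder token ids.
tokens = torch.tensor([[1, 2, 3, 4, 5, 6, 7, 8]])
tagger = SlotTagger(vocab_size=100, num_slots=3)  # e.g. other / destination / time
print(tagger(tokens).shape)  # torch.Size([1, 8, 3])
```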