self-supervised classification distillation

Repost: Deep Learning: Distillation (Distill)

Repost; well written and worth referencing: https://blog.csdn.net/pipisorry/article/details/117257414. In Distilling the Knowledge in a Neural Network, the method Hinton proposes is very simple: use the student model's predicted distribution to ......
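The excerpt stops right at the core idea, which is easy to state in code. A minimal sketch of Hinton-style soft-target distillation, assuming a standard PyTorch setup; the temperature and mixing weight are illustrative choices, not values from the paper:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Hinton-style KD: soften both distributions with temperature T,
    match them with KL divergence, and mix in the usual hard-label CE."""
    soft_targets = F.softmax(teacher_logits / T, dim=-1)
    log_student = F.log_softmax(student_logits / T, dim=-1)
    # T**2 rescales gradients so the soft term stays comparable to CE
    kd = F.kl_div(log_student, soft_targets, reduction="batchmean") * T * T
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```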

Supervised Machine Learning: Regression and Classification

The course is available at: Supervised Machine Learning: Regression and Classification - Week 1: Introduction to Machine Learning | Coursera ......

Cold Brew: Distilling Graph Node Representations with Incomplete or Missing Neighborhoods

Contents: overview, notation, Cold Brew, code. Zheng W., Huang E. W., Rao N., Katariya S., Wang Z., Subbian K. Cold brew: Distilling graph node representations with incomplete or ......

Paper reading: Self-supervised and Interpretable Data Cleaning with Sequence Generative Adversarial Networks

1. GARF overview. Code: https://github.com/PJinfeng/Garf-master. Building on SeqGAN, the authors propose GARF, a self-supervised, data-driven data-cleaning framework. GARF cleans data in two steps: rule generation with SeqGAN, which uses ......

Reading notes: "Progressive Learning of Category-Consistent Multi-Granularity Features for Fine-Grained Visual Classification"

Paper title: "Progressive Learning of Category-Consistent Multi-Granularity Features for Fine-Grained Visual Classification". Authors: Ruoyi D ......

Fine-grained Visual Classification with High-temperature Refinement and Background Suppression

Abstract: fine-grained visual classification is challenging because similarity between categories is high while the variation among samples within a single category differs. To address this, previous strategies focus on localizing the subtle differences between categories and understanding the discriminative features in them. However, the background also provides important information: it can tell the model which features are unnecessary or even harmful for classification, and a model that over-relies on subtle features may ignore global features and ......

[Paper skim] Randomized Quantization: A Generic Augmentation for Data Agnostic Self-supervised Learning

Title: Randomized Quantization: A Generic Augmentation for Data Agnostic Self-supervised Learning. Accepted: ICCV 2023. Paper: https://arxiv.org/abs ......
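The excerpt only names the method, so as a rough illustration of the idea (a hypothetical sketch, not the paper's implementation): randomized quantization destroys fine-grained magnitude information by snapping each channel's values into randomly placed bins, which forces a self-supervised learner to rely on coarser structure.

```python
import torch

def randomized_quantization(x, num_bins=8):
    """Sketch of randomized quantization as an augmentation: per channel,
    split the value range at random thresholds and snap every value to
    the midpoint of its bin. x: (C, L); bin count/midpoints illustrative."""
    out = torch.empty_like(x)
    for c in range(x.shape[0]):
        lo, hi = x[c].min(), x[c].max()
        # random, non-uniform bin edges inside [lo, hi]
        inner = torch.sort(torch.rand(num_bins - 1) * (hi - lo) + lo).values
        edges = torch.cat([lo.view(1), inner, hi.view(1)])
        idx = torch.bucketize(x[c], edges).clamp(1, num_bins) - 1
        mids = (edges[:-1] + edges[1:]) / 2
        out[c] = mids[idx]
    return out
```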

CA-TCC: "Self-supervised Contrastive Representation Learning for Semi-supervised Time-Series Classification" (time series, time-series representations, temporal and contextual contrasting, contrastive learning, self-supervised learning, semi-supervised learning; an extended version of TS-TCC)

It is now 10:48 on November 27, 2023, and I read this paper today. Paper: Self-supervised Contrastive Representation Learning for Semi-supervised Time-Series Classification. GitHub: https://g ......

Multivariate time series classification with a PyTorch LSTM

import torch
import torch.nn as nn
import torch.optim as optim
import numpy as np
from sklearn.model_selection import train_test_split

# generate example data
np.ran ......
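Since the excerpt cuts off before the model, here is a minimal self-contained sketch of a multivariate LSTM classifier in the same spirit; all names, shapes, and hyperparameters are illustrative rather than the post's:

```python
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    def __init__(self, n_features, n_classes, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):               # x: (batch, seq_len, n_features)
        _, (h, _) = self.lstm(x)        # h: (1, batch, hidden)
        return self.head(h[-1])         # logits: (batch, n_classes)

# synthetic data standing in for the post's truncated example
x = torch.randn(128, 50, 3)             # 128 series, 50 steps, 3 variables
y = torch.randint(0, 2, (128,))
model = LSTMClassifier(n_features=3, n_classes=2)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss = nn.CrossEntropyLoss()(model(x), y)
loss.backward()
opt.step()
```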

Kaggle: Otto Group Classification

Kaggle: Otto Group Classification. Data processing: after importing the needed packages, read the data from the csv files, specifying the id column as the index (the id column carries no predictive information anyway), and concatenate the training and test data. train_data = pd.read_csv("dataset/train.csv", ......
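A short sketch of the preprocessing described above; the first pd.read_csv call follows the excerpt, while the "target" column name and the concat step are assumptions about the rest of the post:

```python
import pandas as pd

# read train/test, using the non-predictive id column as the index
train_data = pd.read_csv("dataset/train.csv", index_col="id")
test_data = pd.read_csv("dataset/test.csv", index_col="id")

# stack features so any preprocessing is applied consistently to both;
# "target" (the label column) is assumed and dropped from the train part
all_data = pd.concat([train_data.drop(columns=["target"]), test_data])
```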

Reading notes: "A Survey on Deep Learning-based Fine-grained Object Classification and Semantic Segmentation"

Paper title: "A Survey on Deep Learning-based Fine-grained Object Classification and Semantic Segmentation". Why "Object" rather than "image"? Auth ......

CART (Classification and Regression Trees)

CART (Classification and Regression Trees) is a widely used decision-tree algorithm that can be applied to both classification and regression problems. Proposed by Breiman et al. in 1984, CART is a greedy algorithm based on recursive binary partitioning. A detailed explanation of the algorithm follows: 1. Building the decision tree: CART ......
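To make "greedy recursive binary partitioning" concrete, here is a minimal sketch of the classification-side split search; regression would minimize squared error instead of Gini impurity:

```python
import numpy as np

def gini(y):
    """Gini impurity of a label array."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_binary_split(x, y):
    """CART-style greedy search on one feature: try every threshold and
    keep the split with the lowest weighted Gini impurity. The tree is
    then grown by applying this recursively to each side."""
    best_t, best_score = None, np.inf
    for t in np.unique(x):
        left, right = y[x <= t], y[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score
```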

Self-supervised contrastive pre-training for time series based on time-frequency consistency: "Self-Supervised Contrastive Pre-Training for Time Series via Time-Frequency Consistency" (time series, time-frequency consistency, contrastive learning)

November 10, 2023, 17:34. Reading a paper today; honestly I feel like slacking off and resting and don't want to read it, but it has to be done. Paper: Self-Supervised Contrastive Pre-Training for Time Series via Time-Frequency Consistency, also found as Sel ......

Linkless Link Prediction via Relational Distillation

Contents: overview, notation, LLP, code. Guo Z., Shiao W., Zhang S., Liu Y., Chawla N. V., Shah N. and Zhao T. Linkless link prediction via relational distillation. ICML, 2023. Overview ......

Distilling Knowledge from Graph Convolutional Networks

Contents: overview, notation, DistillGCN, local structure preserving, code. Yang Y., Qiu J., Song M., Tao D. and Wang X. Distilling knowledge from graph convolutional networks. CVP ......

Decoupled Knowledge Distillation

Contents: overview, notation, DKD, code. Zhao B., Cui Q., Song R., Qiu Y. and Liang J. Decoupled knowledge distillation. CVPR, 2022. Overview: the ordinary KD (Knowledge Distillation) loss is decoupled into Tar ......
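The excerpt cuts off mid-word, but the decoupling it refers to splits the KD loss into a target-class term (TCKD) and a non-target-class term (NCKD) that are weighted independently. A rough sketch of that idea; the weights and temperature are illustrative, and details may differ from the official DKD code:

```python
import torch
import torch.nn.functional as F

def dkd_loss(s_logits, t_logits, target, alpha=1.0, beta=8.0, T=4.0):
    mask = F.one_hot(target, s_logits.size(-1)).bool()
    p_s = F.softmax(s_logits / T, dim=-1)
    p_t = F.softmax(t_logits / T, dim=-1)

    # TCKD: binary KL over (target prob, everything-else prob)
    pt_s = p_s[mask].clamp(1e-8, 1 - 1e-8)
    pt_t = p_t[mask].clamp(1e-8, 1 - 1e-8)
    tckd = (pt_t * (pt_t / pt_s).log()
            + (1 - pt_t) * ((1 - pt_t) / (1 - pt_s)).log()).mean()

    # NCKD: KL over the non-target classes only, renormalized
    s_nt = F.log_softmax(s_logits.masked_fill(mask, -1e9) / T, dim=-1)
    t_nt = F.softmax(t_logits.masked_fill(mask, -1e9) / T, dim=-1)
    nckd = F.kl_div(s_nt, t_nt, reduction="batchmean")

    return (alpha * tckd + beta * nckd) * T * T
```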

ST-SSL: "Spatio-Temporal Self-Supervised Learning for Traffic Flow Prediction" (traffic flow prediction, self-supervised)

October 23, 2023, continuing with papers; so sleepy. Paper: Spatio-Temporal Self-Supervised Learning for Traffic Flow Prediction. GitHub: https://github.com/Echo-Ji/ST-SSL. An AAAI 2023 paper ......

Paper: Ultra Fast Deep Lane Detection with Hybrid Anchor Driven Ordinal Classification (an anchor-based method)

Paper: Ultra Fast Deep Lane Detection with Hybrid Anchor Driven Ordinal Classification. Research question / research methods / main conclusions / model / open questions / outline of the paper. Abstract: ......

Neural network basics: a detailed look at binary classification (Binary Classification)

Binary classification. Note: when implementing a neural network, you usually don't iterate over the whole training set with an explicit for loop (a programming tip). Example: logistic regression. Logistic regression is an algorithm for binary classification. Start with an example of a binary classification problem: given an image as input, say this cat, if the image is recognized as a cat then ......
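The "no for loop" tip is about vectorization: with the m training examples stacked as columns of a matrix, one gradient step of logistic regression becomes a handful of matrix operations. A minimal sketch with illustrative shapes, following the usual column-per-example convention:

```python
import numpy as np

m, n = 100, 64                      # examples, features (illustrative)
X = np.random.randn(n, m)           # one column per example
y = np.random.randint(0, 2, (1, m))
w, b = np.zeros((n, 1)), 0.0

z = w.T @ X + b                     # (1, m): all examples at once
a = 1 / (1 + np.exp(-z))            # sigmoid activations
dz = a - y                          # gradient of the logistic loss wrt z
dw = X @ dz.T / m                   # (n, 1), averaged over the batch
db = dz.mean()
w -= 0.1 * dw                       # one gradient-descent step
b -= 0.1 * db
```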

Paper reading: Knowledge Distillation via the Target-aware Transformer

Abstract: Knowledge distillation becomes a de facto standard to improve the performance of small neural networks. Most of the previo ......

[Kaggle] Spam/Ham Email Classification

Basic idea: the task is to classify spam email. Approach 1: use a sequence model such as an LSTM or GRU for classification. Approach 2: use the NLP library spacy, whose textcat component can be used directly for text classification. In practice approach 2 is better than approach 1, but since this is an introductory problem, only approach 1 is used. Reference code for approach 2: https://blog.csdn.net/qq_2 ......
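As a rough sketch of approach 2, assuming spaCy v3's textcat component; the labels and toy data here are invented for illustration:

```python
import spacy
from spacy.training import Example

nlp = spacy.blank("en")
textcat = nlp.add_pipe("textcat")
textcat.add_label("SPAM")
textcat.add_label("HAM")

train = [("win a free prize now", {"cats": {"SPAM": 1.0, "HAM": 0.0}}),
         ("meeting moved to 3pm", {"cats": {"SPAM": 0.0, "HAM": 1.0}})]

nlp.initialize(lambda: [Example.from_dict(nlp.make_doc(t), ann)
                        for t, ann in train])
for _ in range(10):  # a few toy training passes
    nlp.update([Example.from_dict(nlp.make_doc(t), ann) for t, ann in train])

print(nlp("claim your free prize").cats)  # label -> score dict
```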

Why do we need to keep developing different machine-learning models: Do we Need Hundreds of Classifiers to Solve Real World Classification Problems?

Paper: "Do we Need Hundreds of Classifiers to Solve Real World Classification Problems?" Paper link: https://jmlr.org/papers/volume15/delgado14a/delgado14a.pdf ......

Paper reading: Implicit Autoencoder for Point-Cloud Self-Supervised Representation Learning

Reading notes on the 2023 ICCV paper Implicit Autoencoder for Point-Cloud Self-Supervised Representation Learning; the idea is clever, and these notes are very brief ......

(A 2023 survey of point-cloud classification from Xinjiang University, the Chinese Academy of Sciences, and others) Deep learning-based 3D point cloud classification: A systematic survey and outlook

Contents: 1. Introduction; 2. 3D data: 2.1 3D data representations, 2.2 point-cloud storage formats, 2.3 public 3D point-cloud datasets; 3. Deep-learning-based point-cloud classification methods: 3.1 multi-view-based methods, 3.2 voxel-based methods, 3.3 point-based methods: 3.3.1 local feature aggregation: 3.3.1.1 point-wise methods, 3.3.1.2 convolution-based methods, 3.3.1.3 ......

The AlexNet model: ImageNet Classification with Deep Convolutional Neural Networks

Paper: ImageNet Classification with Deep Convolutional Neural Networks. Novelty: the first use of the AlexNet neural network to beat non-neural-network algorithms in ImageNet classification by a large margin. Model: ......

8 Innovative BERT Knowledge Distillation Papers That Have Changed The Landscape of NLP

Contemporary state-of-the-art NLP models are difficult to be ut ......

Chinese-Text-Classification-PyTorch

Chinese-Text-Classification GitHub project: https://github.com/JackHCC/Chinese-Text-Classification-PyTorch. Author: JackHCC. Link: https://www.jianshu.com/p/9438fd0 ......

Reading notes: "ImageNet Classification with Deep Convolutional Neural Networks"

Paper title: "ImageNet Classification with Deep Convolutional Neural Networks". ImageNet: a classic, era-defining dataset. Deep Convolutional: deep convolution received relatively little attention at the time, when traditional machine-learning algorithms dominated. Authors: the first au ......

Unbiased Knowledge Distillation for Recommendation

Contents: overview, UnKD, code. Chen G., Chen J., Feng F., Zhou S. and He X. Unbiased knowledge distillation for recommendation. WSDM, 2023. Overview: knowledge distillation that accounts for popularity bias, applied to recommender systems. UnKD M ......

Knowledge Distillation from A Stronger Teacher

Contents: overview, DIST, code. Huang T., You S., Wang F., Qian C. and Xu C. Knowledge distillation from a stronger teacher. NIPS, 2022. Overview: uses the Pearson correlation coefficient ......
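The excerpt stops right at the method's core, so as a sketch of the stated idea: instead of exactly matching the teacher's probabilities with KL, DIST only asks that student and teacher be linearly correlated, across classes (inter-class) and across the batch (intra-class). Details may differ from the paper's official implementation:

```python
import torch
import torch.nn.functional as F

def pearson_corr(a, b, eps=1e-8):
    """Row-wise Pearson correlation between two (rows, cols) tensors."""
    a = a - a.mean(dim=-1, keepdim=True)
    b = b - b.mean(dim=-1, keepdim=True)
    return (a * b).sum(-1) / (a.norm(dim=-1) * b.norm(dim=-1) + eps)

def dist_loss(s_logits, t_logits, T=1.0):
    p_s = F.softmax(s_logits / T, dim=-1)
    p_t = F.softmax(t_logits / T, dim=-1)
    inter = (1 - pearson_corr(p_s, p_t)).mean()          # per-sample rows
    intra = (1 - pearson_corr(p_s.t(), p_t.t())).mean()  # per-class columns
    return inter + intra
```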