【Pytorch】Torch_Basic_Learning

Preface

This post is a record of self-study experiments done in .ipynb notebooks.
You can view or download it from my personal homepage and try it out yourself; for how to set up Pytorch on Windows 10, please see the previous post~

  • Go to my personal homepage (Github Pages)
  • Go to my personal homepage (mainland-China mirror, loads a bit faster)
  • View or download the Ipynb

Pytorch

  • Tensor computation (like numpy) with strong GPU acceleration
  • PyTorch is an optimized tensor library for deep learning using GPUs and CPUs.
  • It has a CUDA counterpart that enables you to run your tensor computations on an NVIDIA GPU with compute capability >= 3.0 (a short example follows below).
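To make the bullet points above concrete, here is a minimal sketch of NumPy-style tensor math that runs on an NVIDIA GPU when one is available and falls back to the CPU otherwise; the shapes and values are purely illustrative.

```python
import torch

# Pick the GPU if CUDA is available, otherwise stay on the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

x = torch.randn(3, 3, device=device)   # random 3x3 tensor
y = torch.randn(3, 3, device=device)
z = x @ y + 1.0                        # matrix multiply + broadcast add, NumPy-style
print(z.device, z.shape)
```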
Read More

【Pytorch】Configuring a Pytorch Environment on Windows 10

0x00 Preface

I can't be bothered to write much of a preface...
In short:
I hear you, Pytorch, are really something,
yet you refuse to let my Windows use you,
while I also hear Pytorch is great for training models.
Clueless and unhappy at the same time! So I'm going to set one up on my Windows!

Read More

Generative Adversarial Nets

Problem Restatement

This part mainly introduces the principle and implementation of GANs (Generative Adversarial Nets). GANs were proposed by Ian J. Goodfellow et al. in 2014 as a new framework for estimating generative models via an adversarial process, in which two models are trained simultaneously: a generative model G that captures the data distribution, and a discriminative model D that estimates the probability that a sample came from the training data rather than from G. The training procedure for G is to maximize the probability of D making a mistake. This framework corresponds to a minimax two-player game. In the space of arbitrary functions G and D, a unique solution exists, with G recovering the training data distribution and D equal to 1/2 everywhere. In the case where G and D are defined by multilayer perceptrons, the entire system can be trained with backpropagation. There is no need for any Markov chains or unrolled approximate inference networks during either training or generation of samples [1].
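As a rough illustration of that minimax game, the sketch below alternates a discriminator step (push D(x) toward 1 on real data and D(G(z)) toward 0 on generated data) with a generator step (push D(G(z)) toward 1, i.e. maximize the probability of D making a mistake). The MLP architectures, the 784-dimensional flattened inputs, and the hyperparameters are assumptions for illustration only, not the setup of the linked post.

```python
import torch
import torch.nn as nn

# Hypothetical generator G (noise -> sample) and discriminator D (sample -> probability).
G = nn.Sequential(nn.Linear(100, 128), nn.ReLU(), nn.Linear(128, 784), nn.Tanh())
D = nn.Sequential(nn.Linear(784, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1), nn.Sigmoid())

bce = nn.BCELoss()
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)

def train_step(real):                        # real: (batch, 784) training samples
    batch = real.size(0)
    z = torch.randn(batch, 100)              # noise drawn from the prior p_z
    ones, zeros = torch.ones(batch, 1), torch.zeros(batch, 1)

    # Discriminator step: label real data 1 and G(z) 0
    # (i.e. maximize log D(x) + log(1 - D(G(z)))).
    opt_D.zero_grad()
    loss_D = bce(D(real), ones) + bce(D(G(z).detach()), zeros)
    loss_D.backward()
    opt_D.step()

    # Generator step: make D label G(z) as real (the non-saturating form
    # of maximizing the probability of D making a mistake).
    opt_G.zero_grad()
    loss_G = bce(D(G(z)), ones)
    loss_G.backward()
    opt_G.step()
    return loss_D.item(), loss_G.item()
```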

Read More

【O2OTM】Your Searches Reveal Your Preferences: Cross-Modal Recommendation

@(Keywords)[cross-modal, recommender systems, O2OTM, interpretability, machine learning]


Paper: http://mldm.ict.ac.cn/platform/pweb/academicDetail.htm?id=94

Chinese translation: http://blog.csdn.net/okcd00/article/details/51814745

Read More

【Selenium】Automating Gateway Login with Python on Windows

0x00 Preface

The institute turned on two-step verification and a second layer of encryption (I won't say which institute);
the old auto_login script no longer works, so I went looking for a new approach;
I came across a solution using Phantomjs + Selenium, so let's give it a try.
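For reference, here is a minimal sketch of the Phantomjs + Selenium idea mentioned above, assuming an older Selenium release that still ships the PhantomJS driver and a phantomjs binary on the PATH; the gateway URL and form field names are placeholders, not the real portal's.

```python
from selenium import webdriver

# Headless browser, no GUI needed (PhantomJS driver, available in Selenium <= 3.x).
driver = webdriver.PhantomJS()
driver.get("http://gateway.example.com/login")   # placeholder gateway URL

# Placeholder field names: inspect the real login page to find the actual ones.
driver.find_element_by_name("username").send_keys("your_account")
driver.find_element_by_name("password").send_keys("your_password")
driver.find_element_by_name("login").click()

print(driver.title)   # crude check of where the login landed
driver.quit()
```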

Read More