About

TensorHouse is a collection of reference machine learning and optimization models for enterprise operations: marketing, pricing, supply chain, and more. The goal of the project is to provide baseline implementations for industrial, research, and educational purposes.

The project focuses on models, techniques, and datasets that were originally developed either by industry practitioners or by academic researchers who worked in collaboration with leading companies in technology, retail, manufacturing, and other sectors. In other words, TensorHouse focuses mainly on industry-proven methods and models rather than on theoretical research.

TensorHouse contains the following resources:

  • a well-documented repository of reference model implementations,
  • a manually curated list of important papers in modern operations research,
  • a manually curated list of public datasets related to enterprise use cases.

Illustrative Examples

Strategic price optimization using reinforcement learning: DQN learns a Hi-Lo pricing policy that switches between regular and discounted prices. (Animation: price optimization using RL.)
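
For intuition, the sketch below shows the Hi-Lo idea with plain tabular Q-learning rather than the DQN used in the repository's notebook; the two candidate prices, the demand lift on a fresh discount, and the fatigue from repeated discounting are all illustrative assumptions.

```python
# Toy Hi-Lo pricing environment + tabular Q-learning agent (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

PRICES = [10.0, 7.0]          # action 0 = regular price, action 1 = discounted price
UNIT_COST = 4.0

def step(consecutive_discounts, action):
    """Return (profit, next_state). A fresh discount spikes demand, but the lift
    fades if discounts are repeated (a simple reference-price adaptation effect)."""
    price = PRICES[action]
    demand = 100.0 - 6.0 * price
    if action == 1:
        demand += 40.0 * (0.5 ** consecutive_discounts)
        next_state = min(consecutive_discounts + 1, 3)
    else:
        next_state = 0
    demand += rng.normal(0.0, 2.0)
    profit = max(demand, 0.0) * (price - UNIT_COST)
    return profit, next_state

# Q-table over 4 states (0..3 consecutive discounts) and 2 price actions.
Q = np.zeros((4, 2))
alpha, gamma, eps = 0.1, 0.9, 0.1
state = 0
for t in range(20000):
    action = int(rng.integers(2)) if rng.random() < eps else int(np.argmax(Q[state]))
    reward, next_state = step(state, action)
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

# The greedy policy typically alternates: discount while the lift is available,
# then return to the regular price while it "recharges", i.e. a Hi-Lo pattern.
print(np.argmax(Q, axis=1))
```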

Supply chain optimization using reinforcement learning: the World of Supply simulation environment. (Animation: supply chain simulation.)
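
A gym-style sketch of what such a simulation environment exposes is given below: a single-echelon inventory system driven by a hand-written base-stock rule that an RL agent would replace with a learned policy. This is not the World of Supply simulator; the Poisson demand, costs, and lead time are assumptions chosen for illustration.

```python
# Minimal single-echelon inventory environment (illustrative, not the World of Supply simulator).
import numpy as np

class InventoryEnv:
    def __init__(self, lead_time=2, holding_cost=1.0, stockout_cost=5.0, seed=0):
        self.lead_time = lead_time
        self.holding_cost = holding_cost
        self.stockout_cost = stockout_cost
        self.rng = np.random.default_rng(seed)

    def reset(self):
        self.on_hand = 20.0
        self.pipeline = [0.0] * self.lead_time      # orders placed but not yet received
        return self._state()

    def _state(self):
        return np.array([self.on_hand] + self.pipeline, dtype=np.float32)

    def step(self, order_qty):
        self.on_hand += self.pipeline.pop(0)        # receive the oldest outstanding order
        self.pipeline.append(float(order_qty))      # place the new order
        demand = self.rng.poisson(10)               # stochastic demand; unmet demand is lost
        sold = min(self.on_hand, demand)
        lost = demand - sold
        self.on_hand -= sold
        reward = -(self.holding_cost * self.on_hand + self.stockout_cost * lost)
        return self._state(), reward, False, {}

# Baseline: order up to a base-stock level of 40; an RL agent (DDPG/TD3/PPO)
# would learn this mapping from state to order quantity instead.
env = InventoryEnv()
state, total_reward = env.reset(), 0.0
for _ in range(100):
    order = max(0.0, 40.0 - float(state.sum()))     # inventory position = on hand + pipeline
    state, reward, done, info = env.step(order)
    total_reward += reward
print("total cost:", -total_reward)
```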

Demand decomposition using Bayesian Structural Time Series. (Image: demand decomposition example.)
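
A minimal decomposition sketch is shown below using statsmodels' UnobservedComponents, a frequentist structural time series model; the repository's notebook works in a Bayesian setting, and the synthetic weekly demand series here is purely an assumption for illustration.

```python
# Decompose a synthetic weekly demand series into trend and seasonality
# with a structural time series model (illustrative frequentist stand-in for BSTS).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 156                                              # three years of weekly observations (assumed)
trend = np.linspace(100.0, 140.0, n)
seasonal = 15.0 * np.sin(2.0 * np.pi * np.arange(n) / 52.0)
demand = trend + seasonal + rng.normal(0.0, 5.0, n)

model = sm.tsa.UnobservedComponents(
    demand,
    level="local linear trend",                      # stochastic level and slope
    freq_seasonal=[{"period": 52, "harmonics": 3}],  # yearly seasonality
)
result = model.fit(disp=False)

trend_hat = result.level.smoothed                    # smoothed trend component
print(result.summary())
# result.plot_components() draws the fitted level, trend, and seasonal components.
```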

List of Models

  • Promotions and Advertisements
    • Media Mix Modeling: Basic Adstock Model for Campaign/Channel Attribution
    • Media Mix Modeling: Bayesian Model with Carryover and Saturation Effects
    • Dynamic Content Personalization using Contextual Bandits (LinUCB)
    • Customer Lifetime Value (LTV) Modeling using Markov Chain
    • Next Best Action Model using Reinforcement Learning (Fitted Q Iteration)
    • Multi-touch Channel Attribution Model using Deep Learning (LSTM with Attention)
    • Customer Churn Analysis and Prediction using Deep Learning (LSTM with Attention)
  • Search
    • Latent Semantic Analysis (LSA)
    • Image Search by Artistic Style (VGG16)
  • Recommendations
    • Nearest Neighbor User-based Collaborative Filtering
    • Nearest Neighbor Item-based Collaborative Filtering
    • Item2Vec Model using NLP Methods (word2vec)
    • Customer2Vec Model using NLP Methods (doc2vec)
  • Pricing and Assortment
    • Markdown Price Optimization
    • Dynamic Pricing using Thompson Sampling (see the sketch after this list)
    • Dynamic Pricing with Limited Price Experimentation
    • Price Optimization using Reinforcement Learning (DQN)
  • Supply Chain
    • Multi-echelon Inventory Optimization using Reinforcement Learning (DDPG, TD3)
    • Supply Chain Simulator for Reinforcement Learning Based Optimization (PPO)
  • Enterprise Time Series Analysis
    • Demand Forecasting Using ARIMA and SARIMA
    • Demand Decomposition and Forecasting using Bayesian Structural Time Series (BSTS)
    • Forecasting and Decomposition using Gradient Boosted Decision Trees (GBDT)
    • Forecasting and Decomposition using LSTM with Attention
    • Forecasting and Decomposition using VAR/VEC models
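
To give a flavor of the listed techniques, below is a minimal Thompson sampling sketch for the dynamic pricing use case referenced in the Pricing and Assortment section. Demand is modeled as a Bernoulli purchase per visitor with a Beta posterior maintained for each candidate price; the prices, cost, and purchase probabilities are illustrative assumptions, not the repository's implementation.

```python
# Thompson sampling for dynamic pricing (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

prices = np.array([19.0, 24.0, 29.0, 34.0])
true_purchase_prob = np.array([0.45, 0.35, 0.22, 0.10])      # unknown to the seller
unit_cost = 10.0

alpha = np.ones(len(prices))    # Beta(1, 1) prior on the purchase probability at each price
beta = np.ones(len(prices))

profit = 0.0
for visitor in range(10000):
    sampled_prob = rng.beta(alpha, beta)                      # sample plausible purchase probabilities
    i = int(np.argmax(sampled_prob * (prices - unit_cost)))   # price with best sampled expected profit
    purchased = rng.random() < true_purchase_prob[i]          # observe the visitor's decision
    alpha[i] += purchased                                     # posterior update for the chosen price
    beta[i] += 1 - purchased
    profit += purchased * (prices[i] - unit_cost)

print("posterior means:", np.round(alpha / (alpha + beta), 3))
print("total profit:", profit)
```

The posterior concentrates on the profit-maximizing price while spending progressively fewer visitors on clearly inferior prices, which is the exploration-exploitation balance that makes Thompson sampling attractive for price experimentation.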

Approach

  • The most basic models come from the Introduction to Algorithmic Marketing book (book's website: https://algorithmicweb.wordpress.com/).
  • More advanced models use deep learning techniques to analyze event sequences (e.g., clickstream data) and reinforcement learning for optimization (e.g., safety stock management policies).
  • Almost all models are based on industrial reports and real-life case studies.

Community

Follow our Twitter feed for notifications about meetups and new developments.

