5/23/2022

Garrett van Ryzin talks about optimization

Back when I taught revenue management, I read quite a few papers by Professor van Ryzin of Columbia Business School. In one of them he said their research was solving the problems of ten years from now. Note, everyone: this is a business school.

Recently I have been reading a book whose preface, written by a professor, opens with the phrase "the gap between what is taught and what is practiced."

That reminded me of van Ryzin. I looked him up, and he has moved to Amazon. The following is an interesting article you may want to read.

Staff writer, "How distinguished scientist Garrett van Ryzin is optimizing his time at Amazon," Amazon, October 14, 2020.

Prior to Amazon, van Ryzin was a professor of Operations, Technology and Information Management at Cornell Tech, and previously the Paul M. Montrone Professor of Decision, Risk, and Operations at the Columbia University Graduate School of Business.  His university research work has focused on algorithmic pricing, demand modeling, and stochastic optimization.

van Ryzin was also the head of marketplace optimization at ridesharing companies Lyft and Uber, where he led teams that developed models for a variety of functions, such as optimally dispatching drivers to riders, and developing pricing models and driver pay systems that improve market efficiency. Interestingly, a paper van Ryzin wrote while pursuing his PhD at MIT, "A Stochastic and Dynamic Vehicle Routing Problem in the Euclidean Plane," imagined a world of on-demand transportation as far back as 1991.

The different elements of optimization:  

I’d like to think of optimization being made up of human, technical and operational elements.

Open questions 

From a scientific perspective, there are several open questions in all three elements of market optimization. A fundamental one is determining the best approach to take to develop models to drive efficiency.

One approach is to develop structural models from first principles. For example, you could make an assumption that consumers are utility maximizers, develop a utility function and identify the parameters that constitute this utility function.
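As a concrete illustration of this structural approach, here is a minimal sketch. Everything in it (the logit purchase model, the parameters, the grid-search fit) is my own toy assumption, not something from the article: we posit that a consumer buys iff utility a − b·price + noise exceeds zero, then recover the interpretable parameters (a, b) by maximum likelihood.

```python
import math
import random

# Structural assumption: consumer buys iff utility a - b*price + noise > 0.
# Logistic noise gives purchase probability 1 / (1 + exp(-(a - b*price))).
def purchase_prob(a, b, price):
    return 1.0 / (1.0 + math.exp(-(a - b * price)))

def log_likelihood(a, b, prices, sales):
    ll = 0.0
    for p, y in zip(prices, sales):
        q = purchase_prob(a, b, p)
        ll += math.log(q) if y else math.log(1.0 - q)
    return ll

# Simulate sales under "true" parameters (2.0, 0.25), then fit.
rng = random.Random(0)
prices = [rng.uniform(5, 15) for _ in range(2000)]
sales = [1 if rng.random() < purchase_prob(2.0, 0.25, p) else 0 for p in prices]

# Crude grid-search MLE over the structural parameters (a, b).
best = max(((0.2 * i, 0.02 * j) for i in range(21) for j in range(31)),
           key=lambda ab: log_likelihood(ab[0], ab[1], prices, sales))
print(best)  # estimates should land near the true (2.0, 0.25)
```

The payoff of the structural route is that the fitted (a, b) have economic meaning: b is a price sensitivity you can reason about and extrapolate with.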

You could also take a radically different approach and build models based only on the underlying data – where you draw inferences from what the data alone tells you. Here, you’re not worrying about why something happened. Rather, you can use ideas from machine learning to estimate and refine predictive models without trying to understand the underlying mechanics.
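For contrast, a data-only sketch of the same demand question (again my own toy, not the article's): a k-nearest-neighbor predictor that never posits a utility function, it simply assumes that similar prices saw similar demand.

```python
import random

# Data-driven approach: predict demand purely from historical (price, demand)
# pairs -- no model of *why* demand responds to price.
def knn_predict(history, price, k=25):
    nearest = sorted(history, key=lambda obs: abs(obs[0] - price))[:k]
    return sum(d for _, d in nearest) / len(nearest)

rng = random.Random(1)
# The "true" mechanism below is hidden from the predictor; it only sees data.
history = [(p, max(0.0, 100 - 8 * p + rng.gauss(0, 5)))
           for p in (rng.uniform(2, 10) for _ in range(2000))]

print(knn_predict(history, 5.0))  # close to the underlying mean demand at p=5
```

The prediction is accurate where data is dense, but unlike the structural model it offers no parameters to interpret and no principled way to extrapolate beyond the observed price range.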

The uncertainty and complexity inherent in large systems 

Approximation is at the heart of optimization because you can never fully represent the full complexity of a real-world trading system. For example, if a consumer places an order on Amazon, you have to make several sequential decisions with complex interactions.  Which fulfillment center should I take that order from? Should I place the items in the same box or should I pack them in different boxes? How will fulfilling this order impact the availability of inventory for the next order that comes in for that product? And how will it affect the available capacity of my local delivery assets?

You can develop approximation models by using a rolling horizon approach. This involves taking a best guess for what the future entails, and then updating your estimate for the future as and when you get new information. Or you could do something that’s far more sophisticated: build simulations of the future, and use sampling techniques to guide your decisions. You can also utilize reinforcement learning where you fit value functions to historical actions to arrive at decisions that are continually refined based on data.
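The rolling-horizon idea can be sketched with a toy inventory loop (my own example, not Amazon's system): each period we "solve" a tiny plan by ordering enough stock to cover a short horizon of forecast demand, observe actual demand, update the forecast, and re-plan from the new inventory position.

```python
import random

HORIZON = 3  # periods of forecast cover to hold

def rolling_horizon(demands, prior_forecast=10.0):
    inventory, unmet, seen = 0.0, 0.0, []
    for actual in demands:
        # Best guess for the future, refined as new information arrives.
        forecast = sum(seen) / len(seen) if seen else prior_forecast
        # Re-plan: top inventory up to HORIZON periods of forecast cover.
        order = max(0.0, HORIZON * forecast - inventory)
        inventory += order
        served = min(inventory, actual)
        unmet += actual - served
        inventory -= served
        seen.append(actual)
    return unmet

rng = random.Random(2)
demands = [max(0.0, rng.gauss(10, 3)) for _ in range(50)]
print(rolling_horizon(demands))  # re-planning keeps unmet demand small
```

The point is the loop structure, not the (deliberately naive) ordering rule: the plan is never solved once for the whole future, only repeatedly for a short window with the latest information.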

Decomposition is also an important strategy for dealing with the interconnectedness of the different elements of the system. In large systems such as Amazon, everything is related to everything else. Supply affects costs, which affects pricing, which in turn affects demand, which affects dispatch, and so on. Ideally, you’d want to arrive at decisions by taking the whole system into account. However, the size of any real-world system makes this impossible. Any model you arrive at will be too complex, and you’d require a large amount of time to compute anything reasonable.
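Decomposition can be made concrete with a standard dual-decomposition toy (the numbers and linear-demand form are my assumptions, nothing from Amazon): several products share one capacity constraint, and instead of optimizing all prices jointly we put an internal price ("lambda") on capacity, let each product solve its own one-dimensional pricing subproblem, and adjust lambda until total usage fits capacity.

```python
CAPACITY = 70.0
# Each product i has linear demand a_i - b_i * p and uses one unit of
# shared capacity per unit sold.
PRODUCTS = [(60.0, 2.0), (80.0, 4.0), (50.0, 1.0)]

def best_price(a, b, lam):
    # Subproblem: maximize (p - lam) * (a - b*p)  =>  p* = (a + b*lam) / (2b)
    return (a + b * lam) / (2 * b)

def total_usage(lam):
    return sum(max(0.0, a - b * best_price(a, b, lam)) for a, b in PRODUCTS)

# Coordinate the subproblems by bisecting on the capacity price lambda:
# usage decreases as lambda rises, so search for the market-clearing value.
lo, hi = 0.0, 100.0
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if total_usage(mid) > CAPACITY else (lo, mid)
lam = (lo + hi) / 2
print(lam, total_usage(lam))  # total usage meets the shared capacity
```

Each subproblem is trivial on its own; the single coordinating variable lambda replaces what would otherwise be a joint optimization over all products at once, which is exactly the scalability argument above.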

