Showing posts with the label Julia.

8/13/2025

研究 (Research)

Conference papers:
  • C.-H. Hsu and T.-Y. Liao, Enhanced holistic regression for multicollinearity detection and feature selection, Available at SSRN, 2025. (code)
  • 廖庭煜, 維琪, 許志華, and 饒忻, Enhancing Decision-Making under Uncertainty: A Robust Optimization Framework Combining TRIZ and Machine Learning Methods, 2025 Conference on Systematic Innovation and Project Competition; first place, paper competition award. (code)
Journal articles:

6/11/2025

Guides for students in Business Analytics Laboratory (商業分析實驗室學生指引)

Knowledge to master for a better foundation (and future)

Tools and general: 

1/24/2024

Applications of Operations Research (作業研究) (including Optimization)

To raise students' motivation, the following information is provided to help students find direction. These topics are also closely related to the algorithms behind decision support systems that students will encounter in summer internships and future jobs. Much of the content below belongs to graduate-level (master's and doctoral) coursework, so it may also strengthen students' motivation to pursue graduate study:

  • Journals: 
    • INFORMS Journal on Applied Analytics
      • INFORMS is the leading international association for Operations Research & Analytics professionals.
      • The mission of INFORMS Journal on Applied Analytics is to publish manuscripts focusing on the practice of operations research and management science and the impact this practice has on organizations throughout the world.
      • Good topics to be explored for the final project
    • Ramayya Krishnan and Pascal Van Hentenryck, editors, Advances in Integrating AI & O.R., INFORMS EC2021, Volume 16, April 19, 2021.

10/26/2023

Sparse PCA: A New Scalable Estimator Based On Integer Programming

Kayhan Behdin and Rahul Mazumder, Sparse PCA: A New Scalable Estimator Based On Integer Programming, arXiv:2109.11142v2, 2021. (Julia and Gurobi code)

We consider the Sparse Principal Component Analysis (SPCA) problem under the well-known spiked covariance model. Recent work has shown that the SPCA problem can be reformulated as a Mixed Integer Program (MIP) and can be solved to global optimality, leading to estimators that are known to enjoy optimal statistical properties. However, current MIP algorithms for SPCA are unable to scale beyond instances with a thousand features or so. In this paper, we propose a new estimator for SPCA which can be formulated as a MIP. Different from earlier work, we make use of the underlying spiked covariance model and properties of the multivariate Gaussian distribution to arrive at our estimator. We establish statistical guarantees for our proposed estimator in terms of estimation error and support recovery. We propose a custom algorithm to solve the MIP which is significantly more scalable than off-the-shelf solvers; and demonstrate that our approach can be much more computationally attractive compared to earlier exact MIP-based approaches for the SPCA problem. Our numerical experiments on synthetic and real datasets show that our algorithms can address problems with up to 20000 features in minutes; and generally result in favorable statistical properties compared to existing popular approaches for SPCA.
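The SPCA problem sketched in this abstract seeks a unit vector x maximizing xᵀΣx with at most k nonzero entries. As a minimal illustration of the combinatorial structure (my own pure-Python sketch, not the paper's Julia/Gurobi code, and only viable for tiny instances), one can enumerate all supports of size k and take the leading eigenvalue of each principal submatrix:

```python
import itertools
import math

def leading_eigval(S):
    """Largest eigenvalue of a small symmetric PSD matrix via power iteration."""
    n = len(S)
    v = [1.0] * n
    for _ in range(200):
        w = [sum(S[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        if norm == 0.0:
            return 0.0
        v = [x / norm for x in w]
    return sum(v[i] * S[i][j] * v[j] for i in range(n) for j in range(n))

def sparse_pca_bruteforce(Sigma, k):
    """Maximize x' Sigma x over unit vectors with at most k nonzeros,
    by enumerating all supports of size k (exponential in n)."""
    n = len(Sigma)
    best_support, best_val = None, -float("inf")
    for support in itertools.combinations(range(n), k):
        sub = [[Sigma[i][j] for j in support] for i in support]
        val = leading_eigval(sub)
        if val > best_val:
            best_support, best_val = support, val
    return best_support, best_val
```

On a toy spiked covariance matrix whose first two coordinates carry the signal, `sparse_pca_bruteforce(Sigma, 2)` recovers the support (0, 1). The MIP formulations in the paper exist precisely because this enumeration explodes combinatorially.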

11/14/2022

Global Optimization via Optimal Decision Trees

Dimitris Bertsimas and Berk Öztürk, Global Optimization via Optimal Decision Trees, arXiv:2202.06017. (Code in Julia)

The global optimization literature places large emphasis on reducing intractable optimization problems into more tractable structured optimization forms. In order to achieve this goal, many existing methods are restricted to optimization over explicit constraints and objectives that use a subset of possible mathematical primitives. These are limiting in real-world contexts where more general explicit and black box constraints appear. Leveraging the dramatic speed improvements in mixed-integer optimization (MIO) and recent research in machine learning, we propose a new method to learn MIO-compatible approximations of global optimization problems using optimal decision trees with hyperplanes (OCT-Hs). This constraint learning approach only requires a bounded variable domain, and can address both explicit and inexplicit constraints. We solve the MIO approximation efficiently to find a near-optimal, near-feasible solution to the global optimization problem. We further improve the solution using a series of projected gradient descent iterations. We test the method on a number of numerical benchmarks from the literature as well as real-world design problems, demonstrating its promise in finding global optima efficiently.
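The final refinement step mentioned in the abstract, a series of projected gradient descent iterations, is easy to sketch. The following pure-Python example is my own illustration (not the paper's Julia code), minimizing a smooth objective over a box by alternating a gradient step with projection back onto the feasible region:

```python
def project_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n (clip each coordinate)."""
    return [min(max(xi, lo), hi) for xi in x]

def projected_gradient(grad, x0, lo, hi, step=0.1, iters=500):
    """Minimize a smooth function over a box via projected gradient descent."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = project_box([xi - step * gi for xi, gi in zip(x, g)], lo, hi)
    return x
```

For example, minimizing (x₀ − 2)² + (x₁ + 3)² over [0, 1]² converges to the boundary point (1, 0), since the unconstrained minimizer (2, −3) lies outside the box.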

9/01/2022

A Paper-Publishing Scare (出版論文驚魂記)

With the recent advances in decision trees and optimization, I wanted to apply them to problems in production and manufacturing.

After more than a year of work and several thousand lines of code, I distilled the successful part of the results and submitted the paper to a first journal, which rejected it.

3/14/2022

Global and Robust Optimization for Engineering Design

Berk Öztürk, Global and Robust Optimization for Engineering Design, Ph.D. Thesis, MIT, 2022. (thesis, code, talk)

There is a need to adapt and improve conceptual design methods through better optimization, in order to address the challenge of designing future engineered systems. Aerospace design problems are tightly-coupled optimization problems, and require all-at-once solution methods for design consensus and global optimality. Although the literature on design optimization has been growing, it has generally focused on the use of gradient-based and heuristic methods, which are limited to local and low-dimensional optimization respectively. There are significant benefits to leveraging structured mathematical optimization instead. Mathematical optimization provides guarantees of solution quality, and is fast, scalable, and compatible with using physics-based models in design. More importantly perhaps, there has been a wave of research in optimization and machine learning that provides new opportunities to improve the engineering design process. This thesis capitalizes on two such opportunities.

7/01/2021

Algorithms for Decision Making

Mykel Kochenderfer, Tim Wheeler, and Kyle Wray, Algorithms for Decision Making, MIT Press, 2022. (pdf available online)

A broad introduction to algorithms for optimal decision making under uncertainty. We cover a wide variety of topics related to decision making, introducing the underlying mathematical problem formulations and the algorithms for solving them. Figures, examples, and exercises are provided to convey the intuition behind the various approaches. This text is intended for advanced undergraduates and graduate students as well as professionals. The book requires some mathematical maturity and assumes prior exposure to multivariable calculus, linear algebra, and probability concepts. Some review material is provided in the appendix. Disciplines where the book would be especially useful include mathematics, statistics, computer science, aerospace, electrical engineering, and operations research. Fundamental to this textbook are the algorithms, which are all implemented in the Julia programming language.

Prof. Kochenderfer also teaches two related courses at Stanford.

5/18/2021

EE104/CME107 Introduction to Machine Learning

Sanjay Lall, EE104/CME107: Introduction to Machine Learning, Stanford University, Spring Quarter, 2021. (quarter: 10 weeks, Julia, YouTube)

Introduction to machine learning. Formulation of supervised and unsupervised learning problems. Regression and classification. Data standardization and feature engineering. Loss function selection and its effect on learning. Regularization and its role in controlling complexity. Validation and overfitting. Robustness to outliers. Simple numerical implementation. Experiments on data from a wide variety of engineering and other disciplines.
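As a small taste of the regularization topic listed above, here is a hedged pure-Python sketch of one-dimensional ridge regression without an intercept (my own illustration, not course material). The closed form w = Σxy / (Σx² + λ) makes the role of λ in controlling complexity concrete: larger λ shrinks the estimate toward zero.

```python
def ridge_1d(xs, ys, lam):
    """Closed-form ridge estimate for the no-intercept model y ≈ w * x:
    w = sum(x*y) / (sum(x*x) + lam).  Larger lam shrinks w toward 0."""
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)
```

With data xs = [1, 2, 3], ys = [2, 4, 6], setting λ = 0 recovers the least-squares slope 2, while λ = 14 shrinks it to 1.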

S. Boyd is a co-author; both professors write very clearly. Recommended.

5/03/2021

Machine Learning Under a Modern Optimization Lens

Dimitris Bertsimas and Jack Dunn, Machine Learning Under a Modern Optimization Lens, Dynamic Ideas LLC, 2019.
The book provides an original treatment of machine learning (ML) using convex, robust and mixed integer optimization that leads to solutions to central ML problems at large scale that can be found in seconds/minutes, can be certified to be optimal in minutes/hours, and outperform classical heuristic approaches in out-of-sample experiments.

4/11/2020

Algorithms for Optimization by Kochenderfer and Wheeler

Mykel J. Kochenderfer and Tim A. Wheeler, Algorithms for Optimization, The MIT Press, 2019.
This book offers a comprehensive introduction to optimization with a focus on practical algorithms. The book approaches optimization from an engineering perspective, where the objective is to design a system that optimizes a set of metrics subject to constraints. Readers will learn about computational approaches for a range of challenges, including searching high-dimensional spaces, handling problems where there are multiple competing objectives, and accommodating uncertainty in the metrics. Figures, examples, and exercises convey the intuition behind the mathematical approaches. The text provides concrete implementations in the Julia programming language.
Topics covered include derivatives and their generalization to multiple dimensions; local descent and first- and second-order methods that inform local descent; stochastic methods, which introduce randomness into the optimization process; linear constrained optimization, when both the objective function and the constraints are linear; surrogate models, probabilistic surrogate models, and using probabilistic surrogate models to guide optimization; optimization under uncertainty; uncertainty propagation; expression optimization; and multidisciplinary design optimization. Appendixes offer an introduction to the Julia language, test functions for evaluating algorithm performance, and mathematical concepts used in the derivation and analysis of the optimization methods discussed in the text. The book can be used by advanced undergraduates and graduate students in mathematics, statistics, computer science, any engineering field (including electrical engineering and aerospace engineering), and operations research, and as a reference for professionals.
Course website and slides.
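As a taste of the local-descent methods the book covers, here is a minimal gradient-descent sketch with central-difference gradients (my own pure-Python illustration; the book's implementations are in Julia):

```python
def num_grad(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    g = []
    for i in range(len(x)):
        xp, xm = x[:], x[:]
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def gradient_descent(f, x0, step=0.1, iters=300):
    """Plain gradient descent using numerically estimated gradients."""
    x = list(x0)
    for _ in range(iters):
        g = num_grad(f, x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x
```

On the smooth convex test function f(x) = (x₀ − 1)² + 2(x₁ + 2)², the iterates converge to the minimizer (1, −2) from the origin. First- and second-order methods in the book replace the fixed step with line searches or curvature information.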

8/26/2014

Matrix Methods and Applications (矩陣方法與應用)

Prof. S. Boyd writes very clearly, and this quarter he is offering a new undergraduate course, EE103 Introduction to Matrix Methods (Note 1), which promises to be excellent. The course webpage lists several applications of linear algebra:
Audio
Handwritten digit classification
Image processing
Ballistics
Population dynamics
Regression
Control
Dynamic system estimation
Portfolio optimization
Tomography
Time series
Document analysis
(Note 1) Stanford University runs on the quarter system, with 10 weeks of classes per quarter. Perhaps for that reason there is no time to cover the important topic of eigenvalues and their applications.
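Of the applications listed above, regression is the simplest to sketch. Here is a pure-Python example of fitting a line y ≈ a + b·x by ordinary least squares in closed form (my own illustration, not course material):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit of y ≈ a + b*x, via the closed-form
    slope b = cov(x, y) / var(x) and intercept a = mean(y) - b * mean(x)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b
```

For exact data on the line y = 1 + 2x, such as xs = [0, 1, 2] and ys = [1, 3, 5], the fit recovers a = 1 and b = 2.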