Jiashun Jin
Professor
Carnegie Mellon University
http://www.stat.cmu.edu/people/faculty/jiashun
Jiashun Jin is Professor of Statistics and Affiliated Professor of Machine Learning at Carnegie Mellon University. His expertise is in statistical inference for Rare and Weak signals in Big Data. His earlier work was on large-scale multiple testing, focusing on the development of (Tukey's) Higher Criticism and practical False Discovery Rate (FDR) controlling methods. His more recent interests are in the analysis of social networks and text mining. Jin is an elected IMS Fellow and an elected ASA Fellow. He delivered the highly selective IMS Medallion Lecture in 2015 and the IMS AoAS (Annals of Applied Statistics) Lecture in 2016. Jin has co-authored two Editor's Invited Review papers and two Editor's Invited Discussion papers. He has served as Associate Editor for several statistical journals, including the Annals of Statistics and JASA, and gained valuable experience in the financial industry by doing research at Two Sigma Investments from 2016 to 2017.
Title: Analysis of Social Networks
In recent years, the analysis of social networks has attracted a lot of attention. In this lecture, we discuss several problems in network analysis, including global testing, community detection, mixed memberships, and topic estimation. We introduce some recent methods, including the Signed Polygon test, SCORE, Mixed-SCORE, and Topic SCORE. We also discuss a recent data set we collected and cleaned. The data set is based on all published papers in 36 journals in statistics, probability, and machine learning, spanning 40 years; for each paper, it contains the title, authors and affiliations, abstract, MSC numbers, keywords, references, and citation counts.
Yuhong Yang
Professor
School of Statistics, University of Minnesota
https://cla.umn.edu/about/directory/profile/yangx374
Yuhong Yang received his Ph.D. degree in statistics from Yale University in 1996. He first joined the Department of Statistics at Iowa State University, then moved to the University of Minnesota in 2004, where he has been a full professor since 2007. His research interests include model selection, multi-armed bandit problems, forecasting, high-dimensional data analysis, and machine learning. He has published in journals in several fields, including the Annals of Statistics, IEEE Transactions on Information Theory, Journal of Econometrics, Journal of Approximation Theory, Journal of Machine Learning Research, and International Journal of Forecasting. He is a Fellow of the Institute of Mathematical Statistics.
Title: Optimal Model Selection and Model Combination
This lecture will center on understanding the optimalities of model selection and model combination methods in regression learning. First, we will review selection consistency, asymptotic efficiency, and minimax-rate optimalities when different models are considered, and examine several important questions, including which penalty is best (and in what sense) and whether it is feasible to achieve multiple optimalities with a single model selection criterion. Second, we investigate the cross-validation (CV) paradox, clear up some misconceptions related to CV, and present the correct use of cross-validation in relation to the goal of the learning. Third, we study model selection diagnostics (such as instability measures, variable selection deviation, estimated F-measure, and variable importance) to address the problem of irreproducibility of statistical findings due to model selection uncertainty. Lastly, we will learn the basics of aggregation theory to combine the strengths of different learning methods for high-dimensional data.
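The use of cross-validation for model selection discussed above can be illustrated with a minimal, hypothetical sketch (the function names and data are illustrative only, not the lecturer's code): plain k-fold CV estimates out-of-sample prediction error for each candidate model, and the model with the smaller estimate is selected.

```python
import numpy as np

def kfold_cv_mse(X, y, fit, predict, k=5, seed=0):
    """Plain k-fold cross-validation estimate of prediction MSE.

    Each fold is held out once; the model is fit on the remaining
    observations and evaluated on the held-out fold.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    folds = np.array_split(idx, k)
    errs = []
    for fold in folds:
        train = np.setdiff1d(idx, fold)
        model = fit(X[train], y[train])
        errs.append(np.mean((y[fold] - predict(model, X[fold])) ** 2))
    return float(np.mean(errs))

def fit_ls(X, y):
    """Ordinary least squares via numpy's lstsq."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

def pred_ls(beta, X):
    return X @ beta
```

A typical use is to compare a smaller model (a subset of the columns of X) against a larger one by calling `kfold_cv_mse` on each and choosing the model with the lower CV error; how the data are split, and whether the goal is selection or prediction, matters for the conclusions, which is precisely the point of the lecture.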
Yingying Li
Associate Professor
Department of ISOM and Department of Finance,
Hong Kong University of Science and Technology
http://www.bm.ust.hk/isom/faculty-and-staff/directory/yyli
Dr. Li received her BSc in Mathematics from Beijing Normal University and her Ph.D. in Statistics from the University of Chicago. Dr. Li's early research focused on high-frequency volatility estimation. The pre-averaging approach that she proposed with her co-authors has become one of the most widely used estimators in the field. She also looked closely into market microstructure noise and studied its fine properties, including its autocorrelation structure and its relationship with trading information. More recently, her research has focused on large portfolio risk analysis and asset allocation. In one of her latest works, she proposes a novel approach that can asymptotically achieve mean-variance efficiency for large portfolios.
Before joining HKUST, Dr. Li held positions as lecturer and postdoctoral fellow at the Bendheim Center for Finance and the Department of Operations Research and Financial Engineering at Princeton University.
She is an elected fellow of the Society for Financial Econometrics (SoFiE). She serves on the editorial boards of the Journal of Econometrics, Journal of Business & Economic Statistics, and Journal of Financial Econometrics.
Title: Statistical Learning for Investments
In this short course, I plan to introduce two main topics on statistical learning for investments:
(1) optimal portfolio allocation on a large number of assets;
(2) personalized wealth management.
The course is based on a series of my recent research papers.
Weijie Su
Assistant Professor
Department of Statistics, The Wharton School, University of Pennsylvania
https://statistics.wharton.upenn.edu/profile/suw/
Weijie Su is an Assistant Professor of Statistics at the Wharton School, University of Pennsylvania. Prior to joining Penn, he received his Ph.D. in Statistics from Stanford University in 2016 and his B.S. in Mathematics from Peking University in 2011. Su's research interests include high-dimensional inference, multiple testing, statistical aspects of optimization, and private data analysis. He is a recipient of an NSF CAREER Award in 2019.
Title: Recent Advances in False Discovery Rate Control in High Dimensions
We introduce a new method named SLOPE to control the FDR in sparse high-dimensional linear regression. This computationally efficient procedure works by regularizing the fitted coefficients according to their ranks: the higher the rank, the larger the penalty. This is analogous to the Benjamini-Hochberg procedure, which compares more significant p-values with more stringent thresholds. Whenever the columns of the design matrix are not strongly correlated, we show empirically that SLOPE obtains FDR control at a reasonable level while offering substantial power. Although SLOPE is developed from a multiple testing viewpoint, we show the surprising result that it achieves optimal squared errors under Gaussian random designs over a wide range of sparsity classes. Next, we use approximate message passing to analyze the asymptotic behavior of SLOPE and observe several striking phenomena that are not observed in the lasso.
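The rank-dependent penalty described above can be made concrete with a minimal, hypothetical sketch (the function names and the Benjamini-Hochberg-style lambda scaling are illustrative, not the speaker's implementation): SLOPE replaces the lasso's single tuning parameter with a decreasing sequence, and the k-th largest fitted coefficient in magnitude is matched with the k-th largest penalty level.

```python
import numpy as np
from statistics import NormalDist

def bh_lambdas(p, q=0.1):
    """A Benjamini-Hochberg-style decreasing penalty sequence:
    lambda_i = Phi^{-1}(1 - i*q / (2p)), i = 1, ..., p,
    so earlier (larger) coefficients face more stringent penalties."""
    nd = NormalDist()
    return np.array([nd.inv_cdf(1 - (i * q) / (2 * p)) for i in range(1, p + 1)])

def slope_penalty(beta, lam):
    """The sorted-L1 norm: sort |beta| in decreasing order and pair the
    k-th largest magnitude with the k-th largest lambda."""
    abs_desc = np.sort(np.abs(beta))[::-1]
    return float(lam @ abs_desc)
```

With `lam` constant across ranks, the sorted-L1 norm reduces to the ordinary lasso penalty; the decreasing sequence is what creates the analogy to comparing sorted p-values against the Benjamini-Hochberg thresholds.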
Sijian Wang
Associate Professor
Department of Statistics and Biostatistics, Rutgers The State University of New Jersey
https://stat.rutgers.edu/people-pages/faculty/people/130-faculty/381-sijan-wang
Dr. Sijian Wang has a Ph.D. in Biostatistics from The University of Michigan and a B.S. in Mathematics from Tsinghua University, China. He is currently an associate professor in the Department of Statistics at Rutgers University. His research interests include precision medicine, high-dimensional data analysis, proteomics, cancer genomics, bioinformatics, survival analysis, longitudinal data analysis and statistical modeling in science and engineering. He has published many papers in top statistical journals as well as high-impact scientific journals including Nature and PNAS.
Title: Estimation and Inference of Heterogeneous Treatment Effects