
Deep Mukhopadhyay, Ph.D.

21st-century statistics

The "Science" and "Management" of Data Analysis

March 5, 2017 by deepstatorg

Hierarchy and branches of Statistical Science

The phrases “Science” and “Management” of data analysis were introduced by Manny Parzen (2001) while discussing Leo Breiman’s paper “Statistical Modeling: The Two Cultures,” where he pointed out: “Management seeks profit, practical answers (predictions) useful for decision making in the short run. Science seeks truth, fundamental knowledge about nature which provides understanding and control in the long run.” Management = algorithms, prediction, and inference; it is undoubtedly the most useful and “sexy” part of Statistics. Over the past two decades, tremendous advances have been made on this front, producing a growing body of literature and excellent textbooks such as Hastie, Tibshirani, and Friedman (2009) and, more recently, Efron and Hastie (2016).

Nevertheless, we can surely all agree that algorithms do not arise in a vacuum, and our job as statistical scientists should amount to more than finding yet another “gut” algorithm. It has long been observed that elegant statistical learning methods can often be derived from something more fundamental. This forces us to think about the guiding principles for designing (wholesale) algorithms. The “Science” of data analysis = an algorithm discovery engine (an Algorithm of Algorithms). Finding such a consistent framework of Statistical Science (from which one might be able to systematically derive a wide range of working algorithms) promises to be anything but trivial.

Above all, I strongly believe the time has come to switch our focus from “management” to the heart of the matter: how can we create an inclusive and coherent framework of data analysis (to accelerate the innovation of new, versatile algorithms)–“A place for everything, and everything in its place”–encoding the fundamental laws of numbers? In this difficult yet rewarding journey, we have to remind ourselves constantly of the enlightening piece of advice from Murray Gell-Mann (2005): “We have to get rid of the idea that careful study of a problem in some NARROW range of issues is the only kind of work to be taken seriously, while INTEGRATIVE thinking is relegated to cocktail party conversation.”

Filed Under: Blog Tagged With: 21st-century statistics, Core of Data Analysis, Science of Statistics

Confirmatory Culture: Time To Reform or Conform?

November 1, 2016 by deepstatorg

THEORY

  • Culture 1: Algorithm + Theory. The role of theory is to justify or confirm.
  • Culture 2: Theory + Algorithm. From confirmatory to constructive theory, explaining the statistical origin of the algorithm(s): where they came from. Culture 2 views “Algorithms” as the derived product, not the fundamental starting point [this point of view separates statistical science from machine learning].

PRACTICE 

  • Culture 1: Science + Data. The statistician’s job is to confirm scientific guesses, happily playing in everyone’s backyard as a confirmatist.
  • Culture 2: Data + Science. An exploratory nonparametric attitude: the statistician plays in the front yard as a key player, guiding scientists to ask the “right questions.”

TEACHING 

  • Culture 1: Teaching proceeds in the following sequence: for (i in 1:B) { Teach Algorithm-i; Teach Inference-i; Teach Computation-i }. By construction, this requires extensive bookkeeping and memorization of a long list of disconnected algorithms.
  • Culture 2: The pedagogical effort emphasizes the underlying fundamental principles and statistical logic whose consequences are the algorithms. This “short-cut” approach substantially accelerates learning by making it less mechanical and intimidating.

Should we continue to conform to the confirmatory culture, or is it time to reform? The choice is ours, and so are the consequences.

Filed Under: Blog Tagged With: 21st-century statistics, Data Science, Next-Generation Statisticians, Science of Statistics

The Scientific Core of Data Analysis

November 26, 2015 by deepstatorg

Richard Courant’s view: “However, the difficulty that challenges the inventive skill of the applied mathematician is to find suitable coordinate functions.” He also noted that “if these functions are chosen without proper regard for the individuality of the problem the task of computation will become hopeless.” This leads me to the following conjecture: an efficient nonparametric data transformation or representation scheme is the basis for almost all successful learning algorithms–the Scientific Core of Data Analysis–and it should be emphasized in the research, teaching, and practice of 21st-century Statistical Science in order to develop a systematic and unified theory of data analysis (the foundation of data science).
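To make this concrete, here is a minimal sketch of the conjecture in action (my own toy illustration, not from the post; the simulated data and the orthogonal-polynomial basis are assumptions chosen purely for demonstration). The fitting algorithm is held fixed while only the representation of x changes:

    # Identical fitting machinery, two different representations of x
    set.seed(1)
    x <- seq(-1, 1, length.out = 200)
    y <- sin(2 * pi * x) + rnorm(200, sd = 0.2)   # simulated (illustrative) data

    fit_raw   <- lm(y ~ x)            # raw coordinate: a poor representation here
    fit_basis <- lm(y ~ poly(x, 7))   # orthogonal-polynomial coordinate functions

    # Compare how much variation each representation lets the same fitter explain
    c(raw = summary(fit_raw)$r.squared, basis = summary(fit_basis)$r.squared)

In this toy setting the raw-coordinate fit explains only a small fraction of the variation while the basis-transformed fit captures most of it; ordinary least squares is unchanged throughout, so it is the representation scheme, not the fitter, that does the work.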

Filed Under: Blog Tagged With: 21st-century statistics, Core of Data Analysis, Data Science, Next-Generation Statisticians

Two Kinds of Mathematical Statisticians: Connectionist and Confirmatist

June 10, 2015 by deepstatorg

Connectionist: mathematicians who invent and connect novel algorithms based on new fundamental ideas that address real data-modeling problems. Confirmatist: mathematicians who prove why an existing algorithm works under certain sets of assumptions/conditions (a post-mortem report). Theoreticians of the first kind (a few examples: Karl Pearson, Jerzy Neyman, Harold Hotelling, Charles Stein, Emanuel Parzen, Clive Granger) are much rarer than those of the second. The current culture has failed to distinguish between these two types (which are very different in their style and motivation) and has placed excessive importance on the second culture; this has created an imbalance and often gives a wrong impression of what “Theory” means. We need to discover new theoretical tools that not only prove why already-invented algorithms work (confirmatory check) but also provide insight into how to invent and connect novel algorithms for effective data analysis–21st-century statistics.

Filed Under: Blog Tagged With: 21st-century statistics, Confirmatory Theory, Exploratory Theory, Next-Generation Statisticians

Impact: The way I see it

May 16, 2015 by deepstatorg

Theoretical beauty × Practical utility = Impact of your work.

  • By Theoretical beauty, I mean the capacity of a concept or idea for “Unification” (not proving consistency or rates of convergence).
  • Practical utility denotes the generic usefulness of an algorithm that is simultaneously applicable to many problems–wholesale algorithms (not just writing R packages and code).
  • The goal is to ensure that neither quantity on the LHS of the equation is close to ZERO. A perfect balance between the two is required to maximize the impact (which is an art).

Filed Under: Blog Tagged With: 21st-century statistics, impact, Next-Generation Statisticians


Deep Mukhopadhyay
Statistics Department
deep [at] unitedstatalgo.com

EDUCATION

  • Ph.D. (2013), Texas A&M University
  • M.S. (2008), Indian Institute of Technology (IIT), Kanpur
  • B.S. (2006), University of Calcutta, India
