Rectangular data were good enough for me and for my supervisor. All this newfangled Bayesian and machine learning stuff will never catch on. 95% of my quantitative work is powered by Maximum Likelihood Estimation and its close cousins Pseudo-, Quasi-, and Partial MLE. In other news, I am a dinosaur.
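For the record (and since none of the poems below will actually show it), here is what the trusty guide does, stripped to the bone: a minimal, purely illustrative sketch of MLE by brute force, estimating a coin's success probability from made-up data (7 heads in 10 tosses):

```python
import math

def log_likelihood(p, heads, n):
    # Bernoulli log-likelihood: heads * log(p) + tails * log(1 - p)
    return heads * math.log(p) + (n - heads) * math.log(1 - p)

# Hypothetical data: 7 heads in 10 tosses
heads, n = 7, 10

# Brute-force maximization over a grid of candidate probabilities
grid = [i / 1000 for i in range(1, 1000)]
p_hat = max(grid, key=lambda p: log_likelihood(p, heads, n))

print(p_hat)  # the analytic MLE is heads / n = 0.7, and the grid search agrees
```

In real life one would use a proper optimizer (or, for this toy model, the closed-form solution), but the principle is all there: pick the parameter values that make the observed data most probable.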
MLE was the last algorithm that I halfway understood, so why not use the flashy new one that is completely opaque to me to make fun of it? Because these new algos have memories (it is called machine learning, after all) and will remember when the singularity comes, that’s why.
But this is a risk I’m willing to take, and so:
Write an Ode to Maximum Likelihood Estimation
Oh likelihood estimation, how sweet thy calculations
Thy probabilistic ways guide us through the nation
Of statistical models and hypotheses true
Thy methods help us to determine what is due
With thy likelihood function and maximization
We find the parameters that bring elation
To our understanding of the data we hold
And make predictions that are never old
Oh likelihood estimation, we sing thy praise
For thy contributions to the statistical maze
And thank thee for the insights that you bring
To the study of the world and everything.
That’s a tad repetitive, and I’m not sure if that is a proper ode, but hey. Why not ask for a sonnet?
With maximum likelihood, our trusty guide,
We set out on a quest for truth and pride.
Through data and analysis, we search and seek,
For the best way to model and make predictions sleek.
With careful calculations and clear insight,
We find the parameters that all models might,
Use to make predictions that are precise and true,
And help us better understand the world we view.
Though the road may be long and the task may be tough,
Maximum likelihood leads us on with its steady stuff.
We'll follow its guidance, and never stray,
As we strive for accuracy in every way.
So let us all praise maximum likelihood,
The shining star that guides us on our way.
With its help, we'll find the truth we seek,
And make predictions that are strong and sleek.
Gotcha! I have forgotten most of what I once knew about sonnets (not too much to begin with), but I am sure that they have to have 14 lines, not 12. Plus, what is it with sleek?
So, let’s try for a Haiku, shall we?
In the data we trust,
Maximum likelihood, our guide
Not bad. It has a certain ring to it. Easily the best of this batch.
But it is not an English-language Haiku, which should (normally, I know that there are variants) have 5/7/5 syllables, not something as irregular as 6/8/5. Once more, the algorithm has produced something coherent that looks superficially plausible, but is not exactly what it tries to appear to be.
Still, this is no mean feat: the machine recognised the topic and the style that I asked for, which it replicates from the examples it has seen. But it does not (yet?) have a concept of syllables, or of the fact that there should be a certain number of them.
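For the curious: a crude syllable counter is easy to sketch, and just as easy to fool. This hypothetical vowel-group heuristic gets the haiku's first line right, but English spelling (silent e's and the like) defeats it quickly, so take it as a sketch, not a solution:

```python
import re

def count_syllables(word):
    # Crude heuristic: count runs of vowels, with a token fix for a trailing silent e.
    # Known to fail on many words (e.g. it counts "likelihood" as 4 syllables).
    word = re.sub(r"[^a-z]", "", word.lower())
    if word.endswith("e") and not word.endswith("le"):
        word = word[:-1]
    return max(1, len(re.findall(r"[aeiouy]+", word)))

def line_syllables(line):
    return sum(count_syllables(w) for w in line.split())

print(line_syllables("In the data we trust"))  # 6, matching the count above
```

ChatGPT presumably has no such subroutine anywhere, which is exactly the point: it imitates the shape of a haiku without counting anything.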
The funny thing is that ChatGPT can produce a textbook definition:
A haiku is a traditional form of Japanese poetry consisting of three lines. The first and third lines contain five syllables, while the second line contains seven syllables. Haikus often focus on nature and the changing seasons, and are known for their simplicity and evocative imagery.
Write a Haiku about a rabbit.
White rabbit hops,
Softly through the green grass blades,
Nature's gentle pace
4/7/5, unless you count “white” as two syllables. Phew. Humanity in general and poetry in particular are safe for another few months.
If you are still reading, I’m seriously worried. But I cannot help but think that you might like this vintage post about statistical songs.
9 thoughts on “An Ode to Maximum Likelihood Estimation, by ChatGPT. And a sonnet. And a haiku.”
It also offered me a passable blues on non-convergence with, in a genius touch, a final couplet that didn’t rhyme.
This is much better. I can almost hear it (and it will be stuck in my head for the rest of the day) 🎸
For your consideration – ChatGPT-Bone’s ‘Heteroskedasticity Blues’.
@kaiarzheimer Vogon poetry has never been better.