
Friday, May 16, 2025

A Security Trilemma

Playing around with writing malware proofs of concept and running red- and blue-team simulations in my computer lab against Windows Home edition, I feel sort of bad for Windows Home users.

Such users probably constitute the majority of Microsoft's userbase, yet most of the security mitigations available in that edition are not especially effective against real attackers.

Commercial-grade versions of Windows and commercial-grade security products are a different story in some circumstances. Commercial editions of Windows include a lot of nice mitigations and security features. But I think it's kind of an economic trilemma.

There are three properties you might want from a security posture: cheap, convenient, and secure. You can only optimize for two of the three.

  • If it's cheap and convenient, it won't be secure.
  • If it's cheap and secure, it won't be convenient.
  • If it's secure and convenient, it won't be cheap.

There are certainly exceptions to this model, though. Open-source, end-to-end encrypted messaging apps, for example, feel like unlikely tail cases that, to some extent, deliver all three: they're cheap, secure, and convenient.

Sunday, March 30, 2025

Too much efficiency makes everything worse

From "Overfitting and the strong version of Goodhart's law":
Increased efficiency can sometimes, counterintuitively, lead to worse outcomes. This is true almost everywhere. We will name this phenomenon the strong version of Goodhart's law. As one example, more efficient centralized tracking of student progress by standardized testing seems like such a good idea that well-intentioned laws mandate it. However, testing also incentivizes schools to focus more on teaching students to test well, and less on teaching broadly useful skills. As a result, it can cause overall educational outcomes to become worse. Similar examples abound, in politics, economics, health, science, and many other fields.

[...] This same counterintuitive relationship between efficiency and outcome occurs in machine learning, where it is called overfitting. [...] If we keep on optimizing the proxy objective, even after our goal stops improving, something more worrying happens. The goal often starts getting worse, even as our proxy objective continues to improve. Not just a little bit worse either — often the goal will diverge towards infinity.

This is an extremely general phenomenon in machine learning. It mostly doesn’t matter what our goal and proxy are, or what model architecture we use. If we are very efficient at optimizing a proxy, then we make the thing it is a proxy for grow worse.
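The dynamic the excerpt describes can be sketched numerically. Below is a toy illustration of my own (not from the quoted paper): the proxy objective is squared error on a handful of sample points of Runge's function, and the true goal is error on a dense held-out grid. Optimizing the proxy mildly helps both; driving the proxy toward zero with a high-degree polynomial makes the goal worse.

```python
import numpy as np

# True goal: approximate Runge's function f(x) = 1 / (1 + 25 x^2).
# Proxy objective: squared error on 16 equally spaced sample points.
f = lambda x: 1.0 / (1.0 + 25.0 * x**2)
x_train = np.linspace(-1, 1, 16)
y_train = f(x_train)
x_test = np.linspace(-1, 1, 400)   # dense grid stands in for the true goal
y_test = f(x_test)

def errors(degree):
    """Mean squared error on the proxy (train) and the goal (test)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train, test

mild_train, mild_test = errors(3)    # modest optimization of the proxy
hard_train, hard_test = errors(15)   # proxy driven almost to zero

# The proxy keeps improving, while the goal diverges.
print(mild_train, mild_test)
print(hard_train, hard_test)
```

The degree-15 fit's wild oscillation between sample points is the classic Runge phenomenon, which makes it a convenient stand-in for "optimizing the proxy past the point where the goal stops improving."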

Sunday, March 16, 2025

Patterns and the Stock Market

On the random walk hypothesis and post-hoc explanations for describing natural processes, from "Patterns and the Stock Market":

While it's certainly entertaining to spin post-hoc explanations of market activity, it's also utterly futile. The market, after all, is a classic example of a "random walk," since the past movement of any particular stock cannot be used to predict its future movement. This inherent randomness was first proposed by the economist Eugene Fama, in the early 1960s. Fama looked at decades of stock market data in order to prove that no amount of rational analysis or knowledge (unless it was illicit insider information) could help you figure out what would happen next. All of the esoteric tools and elaborate theories used by investors to make sense of the market were pure nonsense. Wall Street was like a slot machine.

Alas, the human mind can't resist the allure of explanations, even if they make no sense. We're so eager to find correlations and causation that, when confronted with an inherently stochastic process - like the DJIA, or a slot machine - we invent factors to fixate on. The end result is a blinkered sort of overconfidence, in which we're convinced we've solved a system that has no solution.
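As a quick sanity check on the random-walk claim, here's a toy simulation of my own (not part of the quoted piece): if each move is independent of the last, past returns carry essentially no information about future ones, which shows up as a near-zero lag-1 autocorrelation.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate daily "returns" as independent coin flips, then cumulate
# them into a price path: a driftless random walk.
returns = rng.choice([-1.0, 1.0], size=100_000)
price = np.cumsum(returns)

# If yesterday's move predicted today's, the lag-1 autocorrelation of
# returns would sit far from zero. For a random walk it hovers near 0.
lag1 = np.corrcoef(returns[:-1], returns[1:])[0, 1]
print(round(lag1, 3))
```

Any "pattern" a chartist reads off the `price` path here is, by construction, noise.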

"Is an economic recession a divergence from the market trend that eventually reverses over time? Or is it more analogous to a random walk?" asked the student.

The master struck him on the head with a walking stick. "Economic recessions are primarily qualitative; attempting to measure them is meaningless."

Quantitative fundamentals play a role in shaping market dynamics, but in a Bayesian spirit, so does the information and discourse circulating within those markets. "It's priced in" is an expression people use to describe the way a market's behavior is not just a sum of monetary fundamentals but also a sum of qualitative sentiment, which is far harder to quantify.

Sure, you could try to measure a recession with GDP output, but for a richer understanding you may have greater success talking with actual market participants, though even then their perspectives will be wildly subjective.

Recessions aside, one could also attempt, at any time, to use technical analysis to predict short-term stock prices. But that may simply be straining at gnats.
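To make the "straining at gnats" point concrete, here's a hypothetical moving-average crossover rule (the parameters are arbitrary choices of mine, not a real strategy) applied to pure random walks. Averaged over many independent walks it earns nothing, because tomorrow's step is independent of everything the rule can see.

```python
import numpy as np

rng = np.random.default_rng(7)

def strategy_pnl(n_steps=1000):
    """Toy technical-analysis rule on one random walk: go long when the
    5-step moving average of price is above the 20-step one, else short."""
    returns = rng.normal(0.0, 1.0, n_steps)
    price = np.cumsum(returns)
    pnl = 0.0
    for t in range(20, n_steps - 1):
        fast = price[t - 5:t].mean()
        slow = price[t - 20:t].mean()
        position = 1.0 if fast > slow else -1.0
        pnl += position * returns[t + 1]   # next step is independent noise
    return pnl

# Averaged across many independent walks, the rule has no edge.
avg = np.mean([strategy_pnl() for _ in range(200)])
print(round(avg, 2))
```

Individual runs can look impressively profitable or ruinous; only the average reveals that the "signal" was never there.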

Furthermore, if someone claims to know the true reasons why the market went up or down, there's a significant possibility they're misleading you, either because they don't know any better or because they're deliberately lying.

If they actually knew—if they actually possessed that knowledge—they would have used it to make money.

Using Python To Access archive.today, July 2025

It seems like a lot of the previous software wrappers to interact with archive.today (and archive.is, archive.ph, etc) via the command-line ...