There's an old joke in the recruiting world - "first thing I do when I get a stack of resumes is to shuffle them up and throw half of them in the trash. I don't want any unlucky people working at this company."

Picking candidates to interview is a hard problem, and it's why you see a lot of investment in referral programs.

It's tricky because there aren't many signals you can get before interviewing someone, and interviewing is expensive. A single hire can easily cost $10k in recruiting labor just to source, and it's not really practical to ask candidates to spend time proving themselves before they're even offered an interview.

In my time as an interviewer for a couple of different tech companies, I'd say at most about 20% of candidates got hired, and a lot of the other 80% were people who had impressive-looking resumes but showed very few positive skill signals in the interview.

Those suffer even harder from the problems AI generally faces.

HIPAA and its ilk make mass farming of detailed, sensitive medical data very difficult (thank goodness!)

AI suffers from biases when either over- or under-trained. An ML engineer has to go to great lengths to teach a model to focus on actual medical signals instead of just slapping "healthy" on young people and "at risk" on old people.

A big revolution of AI in medicine has been "just around the corner" far longer than this current wave of interest in AI, but it still hasn't progressed past leaky preliminary screening (which it's very well suited to do).

It's definitely something to watch out for and be excited about, but we should remain very aware of the nuances of using AI in medicine. Using LLMs to generate patient summaries for doctors is great; using AI models to perform diagnoses is one of those things that's been 5 years away for 15 years now.

Depends. It's much less important than experience and skill. It might get your foot in the door for an interview.

It's much more important for cold networking, but even then there are a lot of companies that are highly respected in the space they work in even if they're nowhere near Fortune 500.

I've certainly had a lot of professional contact from working at Google, but I'd still say more than half of the interest I get on my professional channels comes either from my individual work or from peers in the industry I know personally.

It's all moot if you don't focus on your skills first, though. You can put whatever you want on your resume, but if you can't show up and make yourself demonstrably useful, you won't have the kind of career progression you're looking for.

I'm filing this under "math pranks", right next to the classic trick to "prove" 1=0 by dividing by zero.

It's a clever trick - addition and multiplication are associative, and humans are really good at finding the lazy shortcut to solve a problem. The 2s are much easier to deal with than the 8, division is usually isolated pretty well in math communication so that the whole left-to-right rule isn't at the top of mind.
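
Assuming this is the usual 8 ÷ 2(2 + 2) flavor of the meme (the exact expression wasn't spelled out, so that's my guess), here are the two readings side by side:

```ts
// The classic ambiguous expression, written out both ways.
const strictLeftToRight = 8 / 2 * (2 + 2); // (8 / 2) * 4 = 16
const impliedGrouping = 8 / (2 * (2 + 2)); // 8 / 8 = 1

console.log(strictLeftToRight, impliedGrouping); // 16 1
```

Both camps are "doing math" correctly; they just parsed the expression differently, which is the whole prank.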

This is the math form of the prank where you trick someone into saying they eat soup with a fork by asking them to list similar words first. Good stupid fun designed to make people feel dumb but not in any useful way.

NGL I love the ZMI altar for exactly this. It's not even remotely meta for getting high level runes, but it's a nice little thing if you really just want to make your own at 10 RC.

Do you need it? Likely not. Is it used? Nearly constantly, by at least someone. Is it helpful to you? Absolutely. Is it worth filling up your schedule with that many classes to study it? That's debatable.

I only rarely use algebra, calculus, and statistics, but when a problem comes up that calls for them I'm very glad I paid attention in those classes.

There's also something to be said for knowing how to approach abstract problems in different ways, which some formal math education can bring - especially calculus and statistics, which show us just how bad human intuition is around certain problems.

Ctrl+left/right and the Home/End keys get you 90% of the way to super fast; Vim bindings can get you the other 10%.

Ctrl-click is pretty handy in IDEs that have a "jump to definition" too.

Sure, they don't overlap in a lot of ways... But I'd argue that's a good thing. You learn new ideas, new ways of approaching problems. You can use C++ in the browser but I'd not suggest trying until you really understand the fundamentals of both.

That's more or less my professional specialty, web dev and C++.

Yep, duodecimal is what I want. "Put two extra numbers between 9 and 10" is the best way I've found to describe that to people who aren't already familiar with duodecimal, and I think this sub has a mix of math-versed and not-so-math-versed visitors.
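
If you want to play with it, JavaScript's radix parameters make a decent sandbox (it uses "a" and "b" where duodecimal fans usually write ↊ and ↋):

```ts
// Base-12 round trips via the standard radix parameters.
console.log((51).toString(12));  // "43" -> 4 twelves + 3 ones
console.log(parseInt("b", 12));  // 11 - the second "extra number"
console.log((144).toString(12)); // "100" - a dozen dozens, base 12's "hundred"
console.log((48).toString(12));  // "40" - thirds of 144 divide out clean
```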

There might be, but WebGPU just hit stability earlier this year and still isn't fully supported in major browsers. It takes time to catch up.
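
In the meantime, feature-detecting and falling back is cheap. A minimal sketch - the cast is only there to avoid pulling in the @webgpu/types package for this example:

```ts
// Pick a rendering backend based on what the browser actually exposes.
async function pickBackend(): Promise<"webgpu" | "webgl"> {
  const gpu = (navigator as any).gpu; // navigator.gpu is WebGPU's entry point
  if (gpu) {
    const adapter = await gpu.requestAdapter();
    if (adapter) return "webgpu";
  }
  return "webgl"; // effectively universal fallback
}
```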

I think the major web graphics libraries for JavaScript (Pixi, Three, Babylon) all support it.

For cross platform native/web stuff with WASM targets, you might find some existing renderers.

If you don't absolutely need WebGPU (and you probably don't), you can look at Google Filament, which sounds more or less like what you want.

Yep! If you don't know the gender pronoun to use or want it to be unspecified, "they" is a long standing pronoun to use in English...

... Which is sort of exactly the point in using them as gender identity pronouns. It's a natural fit.

For me it wasn't really tricky to figure that part out; the hard part was un-learning the implicit gendering of people - subconsciously assigning "he/she" to everyone on first meeting. That part is really hard to un-learn for those of us who grew up with a rigid social gender binary. Worth the effort, but still hard.

Others have pointed out it's not by much, but it's enough to catastrophically impact engineering. Let's not change the meter.
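
For concreteness (assuming the proposal upthread was rounding c from 299,792,458 m/s to an even 300,000,000), the meter would shrink by about 0.07%:

```latex
\frac{299\,792\,458}{300\,000\,000} \approx 0.99931
\quad\Rightarrow\quad \text{the ``round'' meter} \approx 999.31\,\mathrm{mm}
```

A bit under a millimeter per meter sounds tiny, until every fastener, bearing, and legacy drawing disagrees with every new one.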

I propose: Metric 2, Electric Boogaloo. This is not serious, don't take it seriously.

  • Unit of length is a clean fraction of the speed of light. 1/300,000,000th of a light-second is fine, so people don't need a new intuition.

  • Unit of temperature is defined in terms of the FUNDAMENTAL specific heat of water instead of phase changes at sea level. No more of this 1 Joule does not equal 1 Calorie crap (worked example after this list).

  • Unit of temperature starts at 0. Kelvin got this right. Celsius did not.

  • Units of time have base 10 conversions as well. How many seconds are in an hour, off the top of your head? Should be an easy question. Day/year don't need clean conversions; they fundamentally aren't aligned, nor are they constant.

  • Align day and year by moving earth further from the sun. Should help a bit with global warming as a side benefit.

  • Add two extra numbers between 9 and 10. I like dividing things by 3 sometimes. Give each human two extra fingers to help it make sense.
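
The worked example promised above: with today's units, heating water makes you carry an arbitrary 4.184 around, because that's how the calorie was defined.

```latex
Q = mc\,\Delta T
  = (1\,\mathrm{kg})(4184\,\mathrm{J\,kg^{-1}\,K^{-1}})(1\,\mathrm{K})
  = 4184\,\mathrm{J} = 1\,\mathrm{kcal}
```

Pick the temperature unit so that constant lands on a power of ten and the crap goes away.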

Arithmetic, linear algebra, and calculus are common in computer graphics, AI, and simulating physical systems. I'll warn here that graphics shows up in a lot more places than just video games, but unless you're interested in those fields I wouldn't sweat it.

Graph theory and combinatorics come in _extremely_ handy nearly daily if you're dealing with distributed systems, highly concurrent systems, algorithm analysis, or domains that work with networks (anything with a "share" button). Not many CS jobs need you to care about that, but a lot of the best CS jobs do.

Beyond that, eh - it's really helpful to be good at math but not necessary. I joke with my product team that if it involves any number that isn't 0, 1, or 2, I'll pull it from a config value and let them sort it out.

Hmm... I'd be surprised if main thread stalling actually stalled the background threads, but not as surprised as I've been by all sorts of nonsense I've run into with WASM multithreaded builds.

WebGL is great, and I believe you're correct that the browser application thread is distinct from the browser render thread. If it's multithreaded rendering you're after, WebGPU doesn't do you any better yet, because it doesn't support sending command lists between threads anyways (or at least didn't last I checked a few months ago). The driver overhead is a lot lower, but if you're not draw constrained (which instancing and texture arrays buy you), it's not critical.

WebGPU is easier to debug and has less legacy API cruft - that's the end of my pitch. I've used both for a while (a couple years before WebGPU even launched in browsers) and I still use GL day to day for all but greenfield projects.

Both flavors of parallelism should work great to my knowledge, I wish you the best of luck!

Oh! Also, for the record, GPT was spot on about the server headers. I'm a skeptic of using GPT for anything even tangentially security related, but this time it nailed it. There's a really clever category of attacks (Spectre-style side channels) that could let an extremely sophisticated attacker read memory from another browser tab using SharedArrayBuffer - the thing required for sharing memory between threads in WASM. Those headers turn on additional isolation that mitigates the risk.
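
For anyone else who lands here: the headers in question are the cross-origin isolation pair. A minimal sketch with Express (my choice for brevity - any server that can set response headers works):

```ts
import express from "express";

const app = express();

// These two headers make the page "cross-origin isolated",
// which is what unlocks SharedArrayBuffer in modern browsers.
app.use((_req, res, next) => {
  res.setHeader("Cross-Origin-Opener-Policy", "same-origin");
  res.setHeader("Cross-Origin-Embedder-Policy", "require-corp");
  next();
});

app.use(express.static("dist")); // serve the WASM build from here

app.listen(8080);
```

One deployment gotcha: with require-corp set, every cross-origin subresource you load needs its own CORP or CORS headers, which is the usual source of headaches.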

Thanks for posting this! Browser multithreading is shamefully underutilized, and this kind of thing is a massive barrier to entry. We need good knowledge dumps like this.

I haven't looked at multithreaded Rust in the browser in years - are blocking primitives on the main thread still no-ops, or is that using the spin-wait stuff Emscripten provides? There are some weird "gotchas" either way. Last I saw someone playing with multithreaded Rust there was some unreliable behavior around synchronization on the main thread; the C++ ecosystem fixes it, but in a way that can deadlock on a busy wait instead.
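
For context on why those gotchas exist at all: the browser refuses to let the main thread block, so toolchains have to fake it. A sketch of the underlying JS behavior (not of Rust or Emscripten internals):

```ts
// Shared memory visible to both the main thread and workers.
const sab = new SharedArrayBuffer(4);
const cell = new Int32Array(sab);

try {
  // On a worker this blocks until notified (or until 100 ms passes).
  // On the main thread the spec forbids blocking, so it throws instead -
  // which is why runtimes resort to spin-waits or no-op locks there.
  Atomics.wait(cell, 0, 0, 100);
} catch (err) {
  console.log("main thread can't block:", err);
}
```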

Also, heads up: if it's using Emscripten, there's a subtle memory leak when it comes to cleaning up your modules. For most apps it's not really an issue, and the fix has been identified, but I haven't gotten around to actually fixing the damn thing.

Best financial advice I ever got wasn't really advice per se, it was "you can't afford to go to school in California."

I moved somewhere with a lower cost of living to get my education and start my career, and I have absolutely no regrets. That decision had more positive impact on my financial well-being a decade later than any other individual decision I've made, including moving back to California for the lucrative tech work.

Oh no... Please tell me you at least don't use quarts and pints and all that though? Or that the cups are defined as some sane number of cm³ or something?

Do we have to wait before moving our kitchen measurements over? Feet and inches are tolerable evils day to day but tablespoons and cups need to go.

Liters are a no-brainer; we use them for soda anyways.

Kilograms and kilometers really aren't that hard, meters is a bit weird but really not bad. I'm in whenever y'all want to change.

I'll die on my Fahrenheit hill though. I'll move once the metric unit for temperature plays nicely with the other units in a reasonable way; right now it's every bit as arbitrary and stupid as the freedom one I'm used to.

Oh cool! What domain? I've never seen one but I'm sure they're out there

I've never seen a pure functional language like Haskell used in industry, but most modern languages have some functional components and there's some hugely useful ideas you can learn from functional programming.

Code written using functional programming ideas tends to be relatively easy to test, easy to maintain, and less prone to bugs. Even if all you take out of it is "unnecessary state is bad, inversion of control is good", you're writing better code.

Java, JavaScript, Rust, Python, and even C++ nowadays have the basic primitives of functional programming. I would definitely take the class if I were you.
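
If you want a taste before signing up, here's the flavor of it in TypeScript (the Order type is made up purely for the example):

```ts
// A taste of the functional style: pure functions, no shared mutable state.
type Order = { total: number; shipped: boolean };

// Output depends only on input; nothing outside the function is touched.
const revenueOfShipped = (orders: readonly Order[]): number =>
  orders
    .filter((o) => o.shipped)
    .map((o) => o.total)
    .reduce((sum, t) => sum + t, 0);

console.log(
  revenueOfShipped([
    { total: 10, shipped: true },
    { total: 99, shipped: false },
    { total: 5, shipped: true },
  ]),
); // 15
```

Nothing mutates, so each piece tests in isolation - that's the "unnecessary state is bad" payoff in miniature.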

There are some style guides out there; Google's is reasonable if you're looking for inspiration, but even they're careful to point out that a lot of their choices are arbitrary and were only picked for consistency.

I like camelCase by default, UpperCamelCase for class names, camelCase_ with an underscore suffix for private member variables, kUpperCamelCase with a k prefix for named constants, and LOUD_SNAKE_CASE for macros / preprocessor defines.

I have no rhyme or reason to any of it other than those are things that are useful for me to identify at a glance occasionally.
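
Sketched in TypeScript just for illustration (the macro rule stays in C/C++ land, since that's where the preprocessor lives):

```ts
const kMaxRetries = 3; // kUpperCamelCase named constant

class ConnectionPool { // UpperCamelCase class name
  private activeCount_ = 0; // camelCase_ private member, underscore suffix

  // camelCase by default for functions and locals.
  checkoutConnection(): number {
    this.activeCount_ += 1;
    return this.activeCount_;
  }
}
```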

We figured out how to make a machine do math. Neat. What's this? We can make that machine all electric and super small. Cool.

We figured out how to describe images and sounds with only numbers. Holy moly. Then we made more machines that know how to turn those numbers back into colors and sound.

Hey, you know what's good for doing things with numbers? Math! That gives me an idea - plug the math machine into the numbers-to-pictures machine.

Aaaaaand that's how a GUI works, in a nutshell. Drivers, firmware, HDMI cables - they're all "just" things to help along the way of turning numbers back into pictures and sounds.
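
To make "turning numbers back into pictures" concrete, here's a tiny sketch using the standard browser canvas API - a pixel is literally just four numbers:

```ts
// Paint a 64x64 square of pink pixels, one RGBA quadruple at a time.
const canvas = document.createElement("canvas");
canvas.width = canvas.height = 64;
const ctx = canvas.getContext("2d")!;

const img = ctx.createImageData(64, 64);
for (let i = 0; i < img.data.length; i += 4) {
  img.data[i + 0] = 255; // red
  img.data[i + 1] = 0;   // green
  img.data[i + 2] = 128; // blue
  img.data[i + 3] = 255; // opaque
}
ctx.putImageData(img, 0, 0); // the machine turns the numbers back into color
document.body.appendChild(canvas);
```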

I've never had so much as a second glance at my year-long gap for calling it a "gap year" and mentioning that I was doing some passion projects in the meantime.

YMMV

I think that does a great job of putting things into scale.

I'll add on that the accumulation of wealth took a long time - that $13k per person took decades to reach even in fast cases like the American tech billionaires, and much longer for old money.

If you were to annualize that, we're probably talking numbers in the hundreds of dollars a year.
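
Back-of-the-envelope, using the $13k figure from the parent comment and a round 40-year accumulation window (my assumption, purely for scale):

```latex
\frac{\$13{,}000 \text{ per person}}{40 \text{ years}} \approx \$325 \text{ per person per year}
```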

Which is another interesting way to frame this - if you squint really hard and ignore a lot of important factors, we're talking about a "cost" of a few hundred dollars per person per year to maintain the hyper-rich.

Not enough to solve the world's problems, but definitely enough to pay for a nice anniversary dinner and weekend vacation that I'd like to have please. I wager the properly poor in developing nations would like it even more.