The joy of recursion, immutable data, & pure functions: Making mazes with JS

If you enjoyed that, I have a blog post on generating mazes in Haskell (from over a decade ago!). The algorithm is very similar, but the code is written using "inductive graphs" and the post is really more of an intro to working with graphs in a purely functional style.

https://jelv.is/blog/Generating-Mazes-with-Inductive-Graphs

3 days ago · tikhonj

Kind of amusing and maybe telling that this article about implementing an algorithm functionally begins by explaining it in an iterative, mutational fashion.

3 days ago · wk_end

Nice animation on the maze building algo.

I remember trying to use Immutable.js back in the day. It's probably pretty great with TypeScript these days, but I remember it was kinda hell with vanilla JS back then: I'd accidentally do things like assign `thing.foo = 42` instead of `thing.set('foo', 42)`, comb through the code to see why it wouldn't work, and never quite know whether I had a plain JS object or a Record. All things fixed by TypeScript, of course.
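
For readers who haven't hit this: a minimal sketch of that failure mode (not the commenter's code), assuming the `immutable` npm package. Plain property assignment only tacks an ordinary JS property onto the wrapper object; the immutable data inside never changes.

```javascript
import { Map } from 'immutable';

const thing = Map({ foo: 1 });

// Looks like it works, but only adds a regular JS property to the wrapper
// object -- the immutable data inside is untouched.
thing.foo = 42;
console.log(thing.get('foo')); // 1  (the "update" silently went nowhere)

// The intended API returns a *new* collection and leaves the original alone.
const updated = thing.set('foo', 42);
console.log(updated.get('foo')); // 42
console.log(thing.get('foo'));   // still 1
```

TypeScript flags the `thing.foo = 42` assignment at compile time, which is the sort of fix being alluded to.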

3 days ago · hombre_fatal

It was great working with Redux, Immutable.js (which Redux recommended at the time), React, and Reselect to manage thousands of points of time-series streaming market data. The state stayed immutable, and when I needed to break cache I returned a new object for the affected slice in the root state tree. That change, swapping the object reference on a leaf of the state tree, triggered Reselect to recompute its memoized selectors, and React / Redux (with shallow equality checks) picked up the new references and rerendered only the impacted components (see the sketch at the end of this comment).

I used Ramda for all data transformations as data flowed through the app. I ran the code in React Native as well, and could count the precise number of operations needed for a megabyte or two of data updating every few seconds, data visualizations included. Nothing was hidden!

A decade ago I attached myself to the Mongo, Express, Angular, Node stack, and it was a disastrous waste of time; nobody was hiring for that skill set. At the beginning of 2018 I started with React, Redux, Immutable, and Reselect, and it was a massive success. I got very lucky: when I started that project, that was the hot new JavaScript stack all the bloggers were raving about. A broken clock is correct twice a day.
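
A minimal sketch (illustrative names, not the app described above) of the memoization pattern from the first paragraph: Reselect only recomputes when an input reference changes, so returning a new object for just the affected slice is exactly what busts the cache.

```javascript
import { createSelector } from 'reselect';

const selectQuotes = state => state.quotes; // one slice of the root state tree

// Re-runs only when `state.quotes` is a new reference.
const selectLatestPrices = createSelector([selectQuotes], quotes =>
  quotes.map(q => q.price)
);

function rootReducer(state, action) {
  switch (action.type) {
    case 'QUOTES_UPDATED':
      // New reference for the affected slice => memoized selectors recompute
      // and connected components rerender; every other slice keeps its reference.
      return { ...state, quotes: action.payload };
    default:
      return state; // unchanged references => selectors and shallow checks skip work
  }
}
```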

3 days ago · dataviz1000

Once you have the maze you need to solve it. JavaScript is a great environment for tinkering. If you would like to try, I made an online maze game for myself this year.

https://hypervariety.com/Amaze/
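
Not from the linked game, just a hedged sketch of the standard way to solve a grid maze once you have one: breadth-first search, assuming a `walls(a, b)` predicate that says whether two adjacent cells are separated by a wall (representations vary by generator).

```javascript
// Breadth-first search from start to goal over a width x height grid.
function solveMaze(width, height, walls, start = [0, 0], goal = [width - 1, height - 1]) {
  const key = ([x, y]) => `${x},${y}`;
  const cameFrom = new Map([[key(start), null]]); // cell -> previous cell
  const queue = [start];

  while (queue.length > 0) {
    const cell = queue.shift();
    if (key(cell) === key(goal)) break;
    const [x, y] = cell;
    for (const next of [[x + 1, y], [x - 1, y], [x, y + 1], [x, y - 1]]) {
      const [nx, ny] = next;
      const inBounds = nx >= 0 && nx < width && ny >= 0 && ny < height;
      if (inBounds && !cameFrom.has(key(next)) && !walls(cell, next)) {
        cameFrom.set(key(next), cell);
        queue.push(next);
      }
    }
  }

  if (!cameFrom.has(key(goal))) return null; // unreachable (can't happen in a perfect maze)

  // Walk the parent links back from the goal to recover the path.
  const path = [];
  for (let cell = goal; cell !== null; cell = cameFrom.get(key(cell))) path.unshift(cell);
  return path;
}
```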

3 days ago · hyperhello

The author has also written a book which I’ve read a couple of times and enjoyed.

A Skeptic’s Guide To Functional Programming With JavaScript

https://jrsinclair.com/skeptics-guide

I’d recommend it for functional-curious JS devs like myself.

3 days ago · joshfarrant

I’m still sad that JS doesn’t have tail call optimization. I’ve always wondered why. Is it hard to implement?

3 days ago · sillysaurusx

If my memory serves right, the browser vendors couldn't agree on the implementation.

On the one hand, allowing any function to be eligible for TCO means that the developer won't know whether the code was optimized or not. A trivial change can easily convert a fast function to one that blows the stack (see the sketch at the end of this comment). Additionally, every function created, or at least every named function, would need to be analyzed, causing a performance hit for everyone, even those who never write a recursive function in their lives.

On the other hand, some argued that TCO eligibility should be explicitly opt-in with some sort of keyword or annotation. If a function so annotated did not end up being a valid target for TCO, the script would not compile, similar to a syntax error. That is an even harsher failure mode than the implicit version, but it would have been easier to identify.

I vaguely recall the Chrome devs generally being in favor of explicit annotations, and Safari of the implicit approach. I could be completely wrong on this, and I don't think anyone was particularly enthused about the trade-offs either way.
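
To make the first concern concrete, a hedged illustration (not code from any engine discussion): with proper tail calls the first version can run in constant stack space, and one tiny edit silently disqualifies it.

```javascript
'use strict'; // the ES2015 spec only guarantees proper tail calls in strict mode

// Tail position: the recursive call is the very last thing the function does,
// so an engine with proper tail calls can reuse the current stack frame.
function sumTo(n, acc = 0) {
  if (n === 0) return acc;
  return sumTo(n - 1, acc + n);
}

// The "trivial change": adding to the *result* of the call. It is no longer a
// tail call, so every engine must keep one frame per step, and a large `n`
// blows the stack -- exactly the silent performance cliff described above.
function sumToBroken(n) {
  if (n === 0) return 0;
  return n + sumToBroken(n - 1);
}
```

Only the first version would benefit even in an engine that ships proper tail calls; everywhere else both eventually overflow.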

3 days ago · zdragnar

People are too addicted to automatic stack traces for mainstream languages to optimize away stack frames.

3 days ago · chowells

It does if you use JavaScriptCore (Safari, other WebKit browsers, Bun).

3 days ago · Jtsummers

TCO is actually in the ES6 spec but browser vendors (except Safari) haven't implemented it due to concerns about debugging (stack traces become less useful) and security implications with stack inspection.
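
A quick, informal way to see which camp your engine is in (an assumption-laden sketch, not an official feature test): this runs to completion under JavaScriptCore, but overflows the stack elsewhere (a RangeError in V8; other engines may use a different error type).

```javascript
'use strict'; // proper tail calls only apply in strict mode

function count(n) {
  if (n === 0) return 'proper tail calls: yes';
  return count(n - 1); // tail position
}

try {
  console.log(count(1e6)); // deep enough to overflow any default stack
} catch (e) {
  console.log('proper tail calls: no (' + e.constructor.name + ')');
}
```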

3 days ago · ethan_smith

Nice read, and beautiful website btw.

5 days ago · jefecoon

Indeed to both. I enjoyed it greatly. It was well written and on something that I think about sometimes but never implemented.

3 days ago · tisdadd

I'm slightly horrified by the memory leak that's casually introduced without even a remark as to its potential to cause a problem. I can't tell if I'm more horrified by the cavalier attitude or by the fact that JavaScript makes a global registry the only easy way to use an object of an arbitrary type as a Map key.

But at the very least, if you're going to memoize immutable values, please do it in a way that allows garbage collection. JavaScript has WeakRef and FinalizationRegistry. (Why it doesn't provide the obvious WeakCache built on those is a mystery, though.)

The issues won't be visible on a toy example like making mazes a few hundred elements across, but if you use these techniques on real problems, you absolutely need to cooperate with the garbage collector.
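
For concreteness, a sketch of the "obvious WeakCache" mentioned above, assuming string keys (e.g. "x,y" for a point); the names and details are illustrative, not from the article. Values are held through WeakRef so the collector can reclaim them, and a FinalizationRegistry cleans up the dead entries afterwards.

```javascript
class WeakCache {
  #refs = new Map(); // key -> WeakRef(value)
  #registry = new FinalizationRegistry(key => {
    // Drop the entry only if it still points at the collected value; the key
    // may have been repopulated with a fresh object in the meantime.
    const ref = this.#refs.get(key);
    if (ref !== undefined && ref.deref() === undefined) this.#refs.delete(key);
  });

  get(key) {
    return this.#refs.get(key)?.deref();
  }

  set(key, value) {
    this.#refs.set(key, new WeakRef(value));
    this.#registry.register(value, key);
    return value;
  }
}

// Usage: intern points without pinning every point ever created in memory.
const pointCache = new WeakCache();
const point = (x, y) =>
  pointCache.get(`${x},${y}`) ?? pointCache.set(`${x},${y}`, Object.freeze({ x, y }));
```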

3 days ago · chowells

Probably better to use an LRU cache rather than weak maps.

3 days ago · b_e_n_t_o_n

No, because then you lose the ability to compare objects for equality.

3 days ago · sltkr

I don't think so?

3 days ago · b_e_n_t_o_n

Yes. It's using a global variable to canonicalize instances so that reference equality is value equality. An LRU cache will evict things that are still in use, so the canonicalization process returns a different instance for the same value. (If it doesn't evict anything still in use, it's strictly inferior to just tying it to the garbage collector anyway.) This breaks the assumption, which the rest of the code depends on, that reference equality is the same as value equality.
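
In sketch form (illustrative names, not the article's code), the invariant at stake looks like this:

```javascript
const registry = new Map(); // "x,y" -> the one canonical point for that value

function point(x, y) {
  const key = `${x},${y}`;
  if (!registry.has(key)) registry.set(key, Object.freeze({ x, y }));
  return registry.get(key);
}

point(1, 2) === point(1, 2); // true: reference equality doubles as value equality

// If an LRU evicted the "1,2" entry while some earlier point(1, 2) was still
// reachable elsewhere, the next call would mint a fresh object, and two
// "equal" points would compare as !== -- the invariant the code relies on is gone.
```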

3 days ago · chowells

Oh yeah, you're right, I wasn't thinking about the possibility of recreating a point that got evicted while the original instance is still hanging around for the comparison...

Personally I'd design it with a Point.Equals(p1, p2) static method and forgo referential equality; then the LRU cache could prevent runaway memory usage. But tbh this is all bikeshedding for this use case anyways :). The original code is fine.
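
For completeness, a sketch of that alternative (illustrative, and a departure from the article's reliance on identity): compare points structurally, and then an LRU cache becomes a pure optimization whose evictions are harmless.

```javascript
class Point {
  constructor(x, y) {
    this.x = x;
    this.y = y;
    Object.freeze(this);
  }

  // Structural equality; callers use this instead of ===.
  static equals(a, b) {
    return a.x === b.x && a.y === b.y;
  }
}

// Anything that was keyed on point identity (e.g. a Map of visited cells)
// would instead key on a derived value such as `${p.x},${p.y}`.
```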