This last vacation week, I got around to doing one of the things I love - intersecting different interests. Like every designer, it seems, I’ve been wanting to update my portfolio for a while. Mainly with the work from my time at LEGO.
Another thing I’ve wanted to do for a while was to play with three.js and its ability to load LDraw files (LDraw is an open, community-driven standard and repository for LEGO CAD models).
So the idea formed in my head that I could build my portfolio in a more playful and interactive way, using actual CAD models of LEGO models and elements I worked on.
Loading LDraw models
Last time I tried loading LDraw models, I struggled to get custom LEGO models “packed” into a single file that would contain all its “parts” and “subparts”. As usual, that was because I didn’t read the documentation properly. Using the unofficial LDraw parts (thanks Philippe Hurbain, for submitting the little dude), I was finally able to pack LEGO Mario and get him onto the scene:
Unofficial LEGO Mario model loaded into three.js
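Once the model is packed correctly, the loading itself is only a few lines. A minimal sketch, assuming three.js and its LDrawLoader addon are in scope (in a real build they'd come from the three.js addons); the paths are placeholders:

```javascript
// Minimal LDraw loading sketch. Assumes THREE and LDrawLoader are
// already loaded; 'ldraw/' and the model URL are placeholder paths.
function loadLDrawModel(scene, url) {
  const loader = new LDrawLoader();
  loader.setPartsLibraryPath('ldraw/'); // lets the loader resolve parts and subparts
  loader.load(url, (group) => {
    group.rotation.x = Math.PI; // LDraw models are -Y up, so flip them upright
    scene.add(group);
  });
}
```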
Physics!
This made me think it could be fun if Mario was bouncing around the scene, accompanied by some Goombas and lightly swaying trees. The first idea that came to mind was using a physics engine and defining the behavior in code. Using cannon.js, I made a little, buggy prototype of LEGO elements bouncing around in a confined space:
LEGO elements bouncing around using a physics engine
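Under the hood, a physics engine mostly does numerical integration plus collision response every frame. Here's a toy, plain-JS version of just the floor bounce, purely to illustrate the idea — in the actual prototype cannon.js handles all of this, including collisions between the elements:

```javascript
// Toy illustration of one physics step: gravity plus an elastic bounce
// against the floor. Not cannon.js code — just the core idea.
function stepBody(body, dt) {
  const g = -9.82;            // gravity in m/s^2 (the value commonly used in cannon.js examples)
  body.vy += g * dt;          // integrate velocity
  body.y += body.vy * dt;     // integrate position
  if (body.y < 0) {           // hit the floor...
    body.y = 0;
    body.vy = -body.vy * 0.7; // ...bounce back, losing some energy
  }
  return body;
}

// Drop a body from 2 m and simulate 2 seconds at 60 fps:
let body = { y: 2, vy: 0 };
for (let i = 0; i < 120; i++) body = stepBody(body, 1 / 60);
```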
Before I forget, I actually also updated my main site this week! New, slightly fresher design, dark/light mode, and backend based on hugo just like this site.
My original idea for this week, rather than the title of this post, was to write out some of the future ideas for the site. Maybe I’ll get around to it next week 🤷🏼♂️, but I’m also going back to work and need to pace my side-project energy as projects wind up again.
I’ve been trying to ignore Web3, and NFTs in particular, for a while. I think apes and black PNGs selling for tens or hundreds of thousands of dollars is dumb. But I’m excited to see computer artists who’ve barely been making a living for years finally achieve financial success.
A few weeks ago, I started studying web3. First by listening to the NFTs, Blockchain, Social Tokens with Sean Bonner episode on Julian Bleecker’s podcast. Then I read Sean’s onboarding document, and the more I learn, the more excited I get. I’ve never felt this uninformed and lost in a technical space, with all the new technologies and terminology emerging, but on the other hand I find the prospects really exciting.
I can’t spill all the beans here since some of my investigations are work-related, but I’m keeping Sean’s words in mind: the best way to get involved in web3 is to be part of the conversation rather than just reading. Here’s what I shared on twitter earlier this week:
web3 is really making my wheels spin, hard to take my mind off of it. I'm sure someone has already done this, but thinking a lot about digital patina or "programmed patina". The idea that an asset changes over time, either as a function of time or being traded.
— Kevin Nørby Andersen (@knandersen) July 11, 2022
I’m curious about the mental model. Should it be “patina”, like how copper turns green when oxidised? Or could it be thought of as “fermentation”, or “aging”, like in fine wine?
Thinking about how Snapchat launched with the short-lived snap, or how twitter launched with a 140-character limit. Both were artificial, “unnecessary” limitations that ended up defining the product. Curious what kind of meaningful limitations web3 could inspire.
Like, could you design limitations that encourage sustainable practices?
These days, I’m still doing sample code projects and watching hours of youtube videos on the subject. Based on my thoughts above, I’d like to try and write a “patinator” that can alter an NFT based on how it lives or is traded. I currently have no idea how to do it, but I’m hoping youtube will enlighten me.
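To make the idea a bit more concrete, here's a hypothetical sketch of what a "patinator" might compute: a patina level derived from an asset's age and how often it has been traded. Every name and weighting here is made up — in practice this would presumably live in a smart contract and drive how the artwork renders:

```javascript
// Hypothetical "patinator": map age and trade count to a patina level in [0, 1].
// The 5-year and 10-trade constants are arbitrary illustration values.
function patina(mintedAt, tradeCount, now = Date.now()) {
  const ageYears = (now - mintedAt) / (365.25 * 24 * 3600 * 1000);
  const ageTerm = 1 - Math.exp(-ageYears / 5);     // slow "oxidation" over years
  const wearTerm = 1 - Math.exp(-tradeCount / 10); // each trade adds a little wear
  return Math.min(1, 0.6 * ageTerm + 0.4 * wearTerm);
}
```

A renderer could then use the patina value to, say, desaturate colors or grow a texture over the artwork.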
Thinking in bits
I stumbled upon this tweet by @fiveoutofnine:
This 1 line of code demonstrates 3 months of research on board representation, piece representation, and chess engines. It checks the legality of any chess knight/king move.
Basically, it’s a very computationally cheap way of calculating a valid chess move. What’s exciting though is that they bothered. Computers have become more capable and cheaper for decades now. Although the web has always aspired to good performance using low bandwidth, it’s nowhere near the kind of efficiency or sensitivity to device constraints that embedded software programming has been practicing all these years.
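To give a feel for the technique (this is my own plain-JS illustration of bitboards, not @fiveoutofnine's actual code): the board is a 64-bit integer with one bit per square (bit 0 = a1, bit 7 = h1), and a knight's legal targets fall out of eight shifts plus masks that prevent wrap-around at the board edges:

```javascript
// File masks to stop shifts wrapping around the board edges.
const NOT_A  = 0xfefefefefefefefen; // everything except file a
const NOT_AB = 0xfcfcfcfcfcfcfcfcn; // everything except files a and b
const NOT_H  = 0x7f7f7f7f7f7f7f7fn; // everything except file h
const NOT_GH = 0x3f3f3f3f3f3f3f3fn; // everything except files g and h

function knightAttacks(square) {
  const b = 1n << BigInt(square);
  return (
    ((b << 17n) & NOT_A)  | ((b << 15n) & NOT_H)  | // up 2, right/left 1
    ((b << 10n) & NOT_AB) | ((b << 6n)  & NOT_GH) | // up 1, right/left 2
    ((b >> 17n) & NOT_H)  | ((b >> 15n) & NOT_A)  | // down 2, left/right 1
    ((b >> 10n) & NOT_GH) | ((b >> 6n)  & NOT_AB)   // down 1, left/right 2
  ) & 0xffffffffffffffffn;
}

// A move is legal iff the destination bit is set in the attack mask:
const isLegalKnightMove = (from, to) =>
  ((knightAttacks(from) >> BigInt(to)) & 1n) === 1n;
```

For example, a knight on a1 (square 0) can only reach b3 (17) and c2 (10) — no conditionals or loops needed, just bit operations.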
I started my journey in tech coding in HTML/CSS/JS for the web back in the 90s and later transitioned to Flash. I didn’t get into embedded systems until programming assembler at university, and I didn’t go deep until I started writing firmware and prototyping with STM 8/32-bit microcontrollers with a few kilobytes of flash during my time at LEGO. My time at LEGO taught me to be smart about every bit and byte that I used, to a degree I had never had to be in the 20 years of programming I had done before that.
Bringing it back to @fiveoutofnine’s tweet. The reason I’m excited is because I think web3 and its decentralised nature, which inherently seems more wasteful than centralised systems, could incentivise computationally cheaper algorithms and thereby more sustainable computing practices. If you develop a “greener” algorithm and claim a small royalty from its use, the world would be better for it. Maybe I’m being naïve, but I think there’s something there.
I’ve been playing the piano since I was a kid, in my teenage years I dabbled in computer music and synthesis, and a couple of years ago I got into modular synthesis. Since I moved back to Denmark and got a piano again, I’ve been curious how to marry analog and digital music… and how to marry that with my new curiosity about three.js and shader programming.
slightly chaotic setup of laptop, eurorack and mic'ed up piano
These are a couple of experiments where I recorded the piano, fed it into the eurorack modular synth, processed it in tone.js using a simple FFT, and wrote a graphical representation in three.js.
Waves001
Waves001 uses the amplitude of all the frequency bins of the FFT as points for a smooth spline curve. The curve is then placed on the scene, and pushes any previous curves further away. source
Experiment with FFT represented as lines
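The core of the Waves001 mapping can be sketched as a pure function that turns one FFT frame into 3D points for a smooth spline (in the scene the points feed a THREE.CatmullRomCurve3). Tone.FFT's getValue() returns decibel values, roughly -100..0 dB; the scaling constants below are placeholders, not the exact ones from my source:

```javascript
// Map one frame of FFT bins (decibel values) to points along the x-axis,
// with amplitude as height. Scaling values are illustrative.
function fftFrameToPoints(bins, width = 10) {
  return Array.from(bins, (db, i) => ({
    x: (i / (bins.length - 1)) * width - width / 2, // spread bins along x
    y: Math.max(0, (db + 100) / 100) * 2,           // louder bin -> taller point
    z: 0,                                           // new curves start at z = 0
  }));
  // In the scene: new THREE.CatmullRomCurve3(points.map(p =>
  //   new THREE.Vector3(p.x, p.y, p.z))), pushing older curves back in z.
}
```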
Waves002
Waves002 uses the amplitude of all the frequency bins of the FFT as points on a circular curve. source
FFT represented as points in a circle
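The Waves002 variant is the same idea with a different mapping: each bin gets an angle on a circle, and its level pushes the point outward from a base radius. Again, the numbers are illustrative rather than the exact ones used:

```javascript
// Map one frame of FFT bins (decibel values) onto a circle; louder bins
// bulge further out from the base radius.
function fftFrameToCircle(bins, baseRadius = 3) {
  return Array.from(bins, (db, i) => {
    const angle = (i / bins.length) * Math.PI * 2;
    const r = baseRadius + Math.max(0, (db + 100) / 100); // level -> radius
    return { x: Math.cos(angle) * r, y: Math.sin(angle) * r, z: 0 };
  });
}
```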
At the time of writing, and maybe even still while reading, the code is poor quality.
New blog infrastructure
I also spent the week migrating this blog to a new design and infrastructure. I used to use Gatsby, but it had too much plumbing and code. I wanted something simpler and ended up settling on hugo. It’s not the most intuitive, but it’s simpler and I like it so far. I might migrate more of my static sites to it in the future.
It’s been 62 weeks since I last posted - sorry! I want to start posting side projects again soon, but this week I will focus on why I’ve been absent.
62 weeks ago I was interviewing for a position that I ended up accepting. Since October 2021 I have been working in the design team at Bang & Olufsen, and have moved to Copenhagen where our design studio is. I lead a small but expanding team of designers, modelled after my dreams, called Tools & Interactions.
Tools
Tools is the aspiration to build products that feel like tools, and to build tools that augment a designer’s ability to think and make.
There’s an old saying that “if all you have is a hammer, everything looks like a nail”. Or put differently, our work is a result of the tools we use for it.
Interaction design for products with both physical and digital components is notoriously difficult to “sketch”. Compare that to classic industrial design, where you can sketch simply using pen and paper. My team and I develop tools that let us sketch with technology effortlessly and at high fidelity. This enables us to iterate much faster and refine our interfaces.
I’m curious how different “intelligences” can interact in my work. These thoughts are still emerging, but I’m thinking about human, parametric and machine intelligences as tools.
Let’s say human intelligence is my ability to think about the concept of a house, and sketch ideas based on my intuition and experience. Parametric intelligence is my ability to instruct a computer on what a house is, and generate houses based on that model of a house. Machine intelligence is using new and emerging AI systems to expand the concept of what a house is or could be. My point isn’t that one kind of intelligence is superior compared to the other, but rather that they interact with and augment each other.
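A toy sketch of what I mean by parametric intelligence, with a deliberately silly model of a house (every parameter and formula here is made up for illustration): the designer encodes what a house *is*, and the computer enumerates instances of it.

```javascript
// Toy parametric model: a "house" as a function of a few parameters.
// All values and formulas are invented for illustration.
function makeHouse({ floors = 1, width = 8, roofPitch = 35 } = {}) {
  return {
    floors,
    width,                          // metres
    height: floors * 2.8,           // assume 2.8 m per storey
    roofPitch,                      // degrees
    rooms: Math.round(floors * width / 4), // crude room estimate
  };
}

// Generate a family of houses by sweeping one parameter:
const houses = [1, 2, 3].map((floors) => makeHouse({ floors }));
```

Machine intelligence would then be the step beyond this: not enumerating a model I wrote down, but expanding what the model itself could be.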
I’m hiring people who can use parametric and machine learning systems to develop these kinds of tools in a way that can inspire our work within industrial design, interaction design, and sound design.
Interactions
Interactions is the aspiration to create beautiful relationships between humans and objects, and between objects in a system. At Bang & Olufsen, we have a discipline called CMF - Color, Material, and Finish. That is a testament to how obsessed the company is with how the product looks and feels as a physical object. I like to talk about my team as the “Digital CMF” team. We think about technologies as materials - they have material qualities and affordances.
We work with technology through sketching and making across bits and atoms (which often requires making new tools). To do this, we have built a rapid prototyping workshop with 3D printing, lasercutting, and electronics engineering capabilities.
Did I mention we are hiring? kean [at] bang-olufsen [dot] dk or dm me at @knandersen
This week I thought I’d share more of the process behind the background graphics of superultra.dk.
The idea was to have some sort of interactive element on the front page that builds on the concept of super ultra. The story behind the name is that it’s an ironic take on my design aesthetic. I like simple, minimalistic, understated objects that unfold in use, but I thought it would be fun to use the prefix super ultra for whatever I create - a super ultra toothbrush, for example.
To express the idea of unfolding meaning in use, I decided to design something scattered that would form a united whole through interaction. Rhino+Grasshopper are among my favorite tools to sketch in 3D, and I love oscillating between sketching in parametric design, hand-drawn CAD, and a physical notebook. I started out by generating “super ultra” as curves, and then used that as the input for my grasshopper sketch. The three versions show the process: curves -> dividing into fragments using a voronoi pattern -> extrusion to give it a third dimension.
curves -> fragmentation -> extrusion
Getting it properly into THREE.js was the biggest challenge, but I eventually got it working by exporting to the DAE format and generating a GLB file. This allowed me to take each fragment in THREE.js and scatter them along the Z-axis. I built the scroll mechanism to assemble the fragments. To encourage interaction and create some ambient motion, I built a timer that starts scattering the fragments even further when there has been no scrolling for a while.
After 3 seconds, the fragments will start moving by themselves
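The scroll and idle logic boils down to an interpolation per fragment. Here's my reconstruction of the idea (not the site's actual source): each fragment has a scattered start position and an assembled target, scroll progress interpolates between them, and after a few idle seconds the fragments drift apart again:

```javascript
// Sketch of the per-fragment logic: scroll assembles, idling scatters.
// The 3-second delay matches the behaviour described; the 0.2 drift
// factor is a placeholder.
const IDLE_DELAY = 3; // seconds before ambient scattering kicks in
const lerp = (a, b, t) => a + (b - a) * t;

function fragmentZ(scatteredZ, assembledZ, scrollProgress, idleSeconds) {
  const t = Math.min(1, Math.max(0, scrollProgress)); // clamp to 0..1
  let z = lerp(scatteredZ, assembledZ, t);            // scroll assembles
  if (idleSeconds > IDLE_DELAY) {
    z += (idleSeconds - IDLE_DELAY) * 0.2 * scatteredZ; // slow drift back out
  }
  return z; // in the scene, this would be applied to each fragment mesh's position.z
}
```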
At this point I was happy with the scattered logo, but still felt the site could use more life. One of the main messages of super ultra is designing for life away from screens, so I wanted the site to feel spatial. I decided to create a particle system that would give the feeling of an atmosphere with dust and debris in it.
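The dust boils down to a cloud of random points. A sketch of the setup, generating positions in the flat array layout that three.js buffer geometry expects (the count and bounds are placeholders):

```javascript
// Generate random particle positions in a box, as the flat
// [x0, y0, z0, x1, y1, z1, ...] layout a 'position' buffer attribute uses.
function dustPositions(count, spread = 20) {
  const positions = new Float32Array(count * 3);
  for (let i = 0; i < count * 3; i++) {
    positions[i] = (Math.random() - 0.5) * spread; // each coord in [-10, 10)
  }
  return positions;
  // In three.js:
  //   geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
  //   scene.add(new THREE.Points(geometry, new THREE.PointsMaterial({ size: 0.05 })));
}
```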
The final result can be seen at superultra.dk. At the time of writing, it doesn’t work well on mobile, but I hope to address that in the future.
When scrolling on the website, the pieces come together