GrooveStomp


UEFI and Windows 10

My primary desktop environment is some variant of GNU/Linux. The only case where this hasn’t been true is when I was really into PC gaming and was operating off of Windows 7 for that purpose.

Recently I’ve become more interested in privacy. One nice thing you can do with several modern GNU/Linux distributions is full disk encryption (FDE). I’ve been using this where possible for over two years now. Unfortunately, FDE requires the GNU/Linux distro to completely take over your HDD. For my purposes, I wanted to dual boot Windows and GNU/Linux and have each of them use FDE. A quick internet search reveals many blog posts and articles on how to accomplish this; I had no success whatsoever. The problem seems to be exacerbated by the switch from BIOS to UEFI. I’ve been able to install Windows 7 and Manjaro KDE side-by-side without encryption on UEFI, but literally every other combination I’ve tried has had some major failure where I’ve been unable to access one OS or the other. Right now it looks like the only solution I have is to keep both OSes unencrypted and modify GRUB to chainload Windows. For now I’m choosing which OS to boot from the UEFI boot menu.
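For reference, the chainloading approach boils down to a custom GRUB menu entry pointing at the Windows boot manager on the EFI system partition. A rough sketch (the partition UUID is a placeholder you’d fill in from blkid):

```shell
# /etc/grub.d/40_custom - chainload the Windows boot manager on UEFI.
# XXXX-XXXX is a placeholder for your EFI system partition's UUID.
menuentry "Windows 10" {
    insmod part_gpt
    insmod fat
    insmod chain
    search --no-floppy --fs-uuid --set=root XXXX-XXXX
    chainloader /EFI/Microsoft/Boot/bootmgfw.efi
}
```

Then regenerate the config with `grub-mkconfig -o /boot/grub/grub.cfg`. No guarantees; as noted, UEFI has a way of defeating these recipes.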

Choosing which OS to boot
UEFI is terrible.
It sounds like UEFI introduces a lot of nice technical upgrades over legacy BIOS mode, but every single time I’ve had to know that I’m dealing with UEFI it’s been because there’s a problem preventing me from doing what I want to do. The best thing I can say about legacy BIOS is that I didn’t know anything about BIOS and I didn’t care. I could do what I needed to do to dual-boot or replace my OS or whatever. UEFI completely breaks this capability of the computer to simply work. Now I need to know what UEFI is and how it works in order to try and configure my computer how I want. Even then it doesn’t usually work; and a cursory search seems to indicate that I’m not the only one who’s encountered these problems. To top it off, I’ve technically found a workable solution for what I want (though both my logical volumes are now unencrypted…), but I have to select which OS to boot from the UEFI boot menu. To do this I can’t use my regular keyboard (a Kinesis Advantage), so I need to have a secondary keyboard plugged in. This situation is full of so much fail and is so obviously worse than where things were before. Alas, such seems to be the way of things.

Windows
It’s pretty cool to shit on Windows these days. The web is cool, everybody does web development, and all web developers use a unixy environment. (Likely some variant of Mac, though there are a few of us GNU/Linux stalwarts kicking around.) Windows is pretty awesome, though. I’m not a die-hard Windows guy, so I’m not intimately familiar with how awesome Windows is (or, conversely, the extent to which it is awful), but there are a few obvious things:

Again, very high level as I’m not a Windows user so I don’t have an in-depth understanding of these things. These are just the obvious bits that I am aware of.

Installing Windows 10 is awful.
It is literally the most awful OS installation procedure I’ve ever gone through.
First of all, how do you get Windows 10? Buy a CD? Get a downloadable image? Nope! You need to buy a new computer with it already installed, or upgrade from an older OS. You literally cannot get an installable image of Windows 10 for an old computer.
Okay, sure. That’s terrible, but let’s go along with it.

I’ll just install Windows 7 here, then get the Windows 10 installer and update. No problem. This doesn’t work! I’m serious, you can’t do that. First you have to install every single update that you can. In my case this was 117 required updates taking up over 1GB of bandwidth. This is required before you can install Windows 10.
Okay, that is seriously bad, but let’s go along with it.

Okay, two updates simply fail over and over. I also can’t install all the updates at once for some reason, so I select a subset and install those in one go. I do about 30 at a time, and every single group requires a full restart.
This is really, really awful, but fine, whatever, we’re pretty used to that by now.

Now I can install Windows 10, right? Sure, let’s run the installer. Going… fine; screens, click through; okay, now I have to enter a product key. I don’t have a Windows 10 product key, so I use my Windows 7 product key, right? I literally have no idea, and there’s no documentation on this anywhere obvious. Let’s try it.
Nope. Doesn’t work.
Okay, let’s search and see what we can find. Hmm. Looks like the Windows 7 product key should work fine. You actually need to update your entire OS first before you try to install Windows 10, and then you won’t be prompted for a product key, because Windows knows you’re already running an activated Windows OS and will register that with your motherboard, so you’re automatically good to go.
Except that is clearly not happening. I am very clearly being asked for a product key despite being as up to date as I can be. Okay, whatever, let’s find some help. Oh, look, a Youtube video that says to just use this random product key to proceed. Okay, that works. Great. Now Windows 10 is installing. Awesome. It’s finished. Okay, now let’s just change the background because the default one is garish. Oh. Windows 10 isn’t activated. Hmm. Okay, activate it. Oh, I have to input a product key. Okay… Oh, Windows 10 doesn’t accept my Windows 7 product key.
THIS IS LITERALLY SO AWESOME THAT MY MIND HAS BROKEN.

So, I opened a support chat with Microsoft. A customer support agent used remote desktop to troubleshoot things. It looks like they mostly just enabled a bunch of services that I disabled when installing Windows 10 and that essentially allowed Windows 10 to update fully and activate my key. I’m sure I missed something there, but that’s certainly what it looked like.
Yes, it looks like configuring Windows 10 during installation to not automatically share all my data with Microsoft broke the installation completely.
This entire process is so broken. How did any of this get okayed?

Well, it seems like I might be in a position now to finish configuring my desktop. It’s only taken three days so far. The first step will be trying to get a bootable image out of Windows 10 in case things get screwed up along the way to final configuration. Wish me luck.

An Update On Self-Hosting Progress

Well, it seems I’m making very slow progress toward my goal of self-hosting. The first issue I’ve run into is with officially supported OSes. I like running Arch Linux, but Arch is not officially supported by either OwnCloud or GitLab. GitLab is the more demanding of the two applications, requiring a minimum of 2GB of RAM and strongly recommending 2 cores. Unfortunately, I have Manjaro (an Arch derivative) running on my Dell Mini 10, where I installed GitLab via Yaourt; but that machine only has 1GB of RAM and a single core. Luckily, my Acer Aspire One D270 has 2 cores and 2GB of RAM, and I just finished installing CentOS 7 on it the other day. I’ll see if I can get GitLab running there. The GitLab installation instructions are not at all breezy, though, so we’ll see how far I make it this week. In the meantime, I’ll continue searching for a Debian 7 or CentOS 6.5 image for the 32-bit Dell Mini 10 so I can start installing OwnCloud on that.
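For the curious, CentOS 7 should at least put me on the officially supported path. My understanding of the Omnibus-style install is roughly the following; treat it as a sketch, not gospel:

```shell
# Add GitLab CE's package repository, then install the omnibus package.
curl -sS https://packages.gitlab.com/install/repositories/gitlab/gitlab-ce/script.rpm.sh | sudo bash
sudo yum install -y gitlab-ce

# Set external_url in /etc/gitlab/gitlab.rb, then apply the configuration.
sudo gitlab-ctl reconfigure
```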

Self Hosting

I’m going to start this post with a little problem I created for myself.

So many computers

Let’s see here. I have a gaming desktop at home, a work laptop at work, a netbook at home for resource-constrained programming and a Raspberry Pi. That’s cool. Daniela also has her own computer, but that’s hers and I simply leave it be. Two weeks ago I caught myself watching The 8-Bit Guy talking about the Macbook Core Duo and seeing if it’s obsolete1. I was really impressed by how capable the machine still is, and also had the Libreboot project sitting in the back of my mind. Specifically, Libreboot mentions 2008 and earlier Intel chips possibly being okay… So that’s arriving on Thursday, presumably. Oh, and I just bought another netbook off of a coworker.

Maybe, just maybe I have a problem.

Did I happen to mention that I’m looking to get a replica Altair 8800, too?2

Power Usage

Okay, time for a little tangent.

One interesting thing about all these different computers is how much power they draw and what they can be used for. Initially I picked up my first netbook for resource-constrained programming, thinking I could do some software rendering optimizations on it. As my collection of computers has grown, my goals for each machine have changed. I now envision an army of servers aided by a couple of special-purpose machines. Well, maybe not an army, but definitely two servers (each netbook) plus my Raspberry Pi hooked up as a torrent machine. My desktop will change once again to a hybrid gaming/programming machine - assuming I can ever wrangle UEFI dual boot with full disk encryption - and my work machine will remain as-is.

I think the netbooks are great as potential servers because they have pretty low power draw compared to my mammoth desktop. According to some quick searches and rough calculations, I’m looking at power draws of 10-22W, 45W and 3.5W respectively for my Dell Mini 10 (netbook 1), Acer Aspire One (netbook 2) and Raspberry Pi. Even at the worst case of 22W running around the clock, netbook 1 only consumes about half a kilowatt-hour per day (22W × 24h ≈ 530Wh). Compare that with my desktop, whose power supply is so large that I know it’s rated for more than 500W - I just can’t remember how much more.

Privacy

The new Macbook is interesting. It is apparently supported by Libreboot.3 I actually didn’t realize this before splurging on it, but it has turned out to be a happy coincidence. I mean, I did have suspicions it would work out… But why do I care about Libreboot? Have you heard about Intel’s Management Engine?4 5 Or how about AMD’s PSP?6 Basically, we’re all screwed. As my good friend Lucas says, “Encrypt Everything.”7 At work I’ve already configured DNSCrypt, and you may have noticed that my website is also served via HTTPS now. This is all part of the big plan.

Self Hosting

Right now I host my website via GitHub pages. I also host all my code projects on BitBucket and GitHub. I have photos up on SmugMug and I stream videos from Netflix. That last one will probably remain, but I will definitely change the others.

Here’s the plan: My brother recently brought to my attention that my router is supported by OpenWRT8. Great!

  1: Install OpenWRT onto router
  2: Install OpenVPN9 onto router
  3: Install DNSCrypt10 onto router
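On OpenWRT, steps 2 and 3 should mostly be a matter of opkg. A sketch (package names as found in the OpenWRT feeds; I haven’t run these on my router yet):

```shell
opkg update
opkg install openvpn-openssl   # OpenVPN (OpenSSL flavour)
opkg install dnscrypt-proxy    # DNSCrypt client

# Enable and start the DNSCrypt proxy service.
/etc/init.d/dnscrypt-proxy enable
/etc/init.d/dnscrypt-proxy start
```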

That’ll set up my home environment so all of our locally networked devices have a secure, private connection out to the wide internet. Now, remember all my netbooks? Yeah, those fit nicely into the plan.

  4: Setup OwnCloud11 on netbook 2
  5: Self-host my website on netbook 1
  6: Host code projects via GitLab12 on netbook 1

The final piece will be hacking together a gateway so my webserver is accessible publicly. I may use this solution to cover my OwnCloud server as well.

  7: Get a small VPS instance for $5 USD/mo. from Digital Ocean13
  8: Setup SSL keys between my VPS and home machine(s) via Let’s Encrypt14
  9: Forward appropriate requests from my VPS to my home machines via some wizardry
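The “wizardry” in step 9 could be as simple as a persistent reverse SSH tunnel from a home machine to the VPS. A sketch (hostname and ports here are made up):

```shell
# From the home webserver: publish local port 8080 on the VPS's loopback.
# -N: run no remote command; -R: reverse port forward.
ssh -N -R 8080:localhost:8080 user@vps.example.com

# On the VPS, a reverse proxy (nginx, say) then forwards public port 80
# traffic to localhost:8080, completing the path to the home machine.
```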

Well, that’s the plan anyway. Oh, I intend to self-host email as well. But, I believe that option is rife with complexities, and I’m reasonably happy with ProtonMail15 so far.

PS: I pay Private Internet Access for access to their non-logging VPN.16


Footnotes

1 The 8-Bit Guy: Is It Obsolete?
2 Briel Computers Altair 8800 Micro
3 Libreboot: Macbook 2,1
4 Hackaday: The Trouble With Intel’s Management Engine
5 Libreboot: Intel ME
6 Libreboot: AMD PSP
7 Lucas Amorim: Encrypt Everything
8 OpenWRT
9 OpenVPN on OpenWRT
10 DNSCrypt on OpenWRT
11 OwnCloud
12 GitLab
13 Digital Ocean
14 Let’s Encrypt
15 ProtonMail
16 Private Internet Access

MatthewMatosis

This is a short post. I really like videogames, but I often find myself lacking motivation to play. I think this happens a lot because I know there’s a serious time commitment to actually complete most games and I’m not normally able to play for more than an hour or so.

Well, Dark Souls has been absolutely consuming my free time lately.

I got quite into Dark Souls about a year ago when I picked up a copy for PC. Since then my interest has waxed and waned until lately where I’ve been watching Let’s Plays and Guides.

Well, I found an awesome videogame critic: Matthewmatosis1. I strongly recommend watching his reviews of the Metal Gear2 and Zelda3 series. Both are really awesome. His Zelda series, in particular, really nails pretty much everything I’ve ever thought about those games. He’s also got a great in-depth commentary on Dark Souls4, and a critique of Dark Souls II5.


Footnotes

1 Matthewmatosis Youtube page
2 Matthewmatosis Metal Gear Solid Reviews
3 Matthewmatosis Zelda Reviews
4 Matthewmatosis Dark Souls Commentary
5 Matthewmatosis Dark Souls 2 Critique

Daily Coding Practice

I’ve mentioned before that I’ve begun what I call “daily coding practice.”1 My inspiration for this comes directly from Mike Acton by way of HandmadeCon2. Mike said that we as serious, professional programmers should be dedicating some slice of time every day to practicing our craft, even if it’s as little as 30 minutes. Now, Mike also emphasized small, throw-away practice, similar to doing scales if you were to practice guitar. You don’t do part of your scales then pick up the next day and continue. I have yet to fully come to terms with this, but I have at least taken inspiration from the “a little bit every day” part of it.

It is not easy to do this.

I’m managing 3-4 times per week. It’s not easy, but it is rewarding, and I think I can keep this going for a long time; perhaps even indefinitely. Like I said, I’m not doing throw-away practice like Mike suggested, so sometimes I end up spending 2 or more hours trying to finish off what I’m working on. A couple of times I’ve spent more than 4 hours in the evening. I don’t recommend this - it wears you out very quickly. I am, however, not only working on side projects. I did a little Bézier curve rendering on the Pico-83, and I fully intend to do similar projects in the future. I also have plans to work through Project Euler in Rust and other languages. Heck, just take a look at my wiki page!4

There are a lot of ways you can tackle daily coding practice. It depends on how much free time you have, how engaged you are in other projects, what other responsibilities you have, whether you even want to do it. My situation is this: I have family responsibilities and I’m typically thoroughly engaged for the full work day. I have a solid two hours to spare in the evening. This is completely unaccounted for time. There are three really big, obvious things I want to focus on in this two hours:

  1. Family
  2. Health and Fitness
  3. Programming

I’m not perfectly efficient, however, and find myself unable to concentrate or otherwise unmotivated with some frequency. Note that I’ve ordered this list by how I value each item’s relative importance. I will always prioritize time with my family if the need is there, and often if the opportunity presents itself. In actuality, I always prioritize #3 over #2, but that’s not what I want to keep doing, and hopefully I’ll be able to report otherwise soon. What I’m ultimately trying to say is that two hours is not a lot of time in which to fit all three items. It’s doable, but it requires good discipline. I lack that discipline, but am working on it.

I find it difficult to make really good progress on side projects by divvying up this 2 hour chunk of time. I also spend this time on videogames, and sometimes boardgames. (At least I used to do boardgaming somewhat consistently.5) I align myself with Cal Newport and his thoughts on deep work6 and have found my best progress has come when I’ve been able to dedicate 4+ hours to a project. On a weekday this requires sacrificing something else like my evening stretches (Yes, that #2 above is in addition to stretching) or having a “proper” goodnight routine. A viable alternative I’ve been inadvertently using is to focus on one or two items on a given day, shuffling the remaining items to other days. Maybe I’ll do 2 hours of fitness on Monday, then 2 hours of programming on Tuesday, then 2 hours of programming on Wednesday. Like anything worth doing, it’s not always easy to do and the benefits aren’t obvious or immediate, but I do think it’s a worthwhile endeavor and will see where it leads me.


Footnotes

1 Previous post of mine mentioning daily coding practice. 2015-12-12: Ludum Dare
2 Mike Acton speaking at HandmadeCon
3 Bezier Curves on the Pico-8
4 BitBucket Practice repository wiki page
5 My posts about boardgames
6 About Cal Newport

Ludum Dare (2015-12-12)

Ludum Dare 34 kicks off this weekend. There are two themes that tied: “two button controllers” and “growing.” I was fully intending to participate, but I am having serious doubts about that decision now. I’ve managed to build the tiniest of skeletons of an application, but I have so much to do in order to get anywhere near a working game that I know it’s basically infeasible. This is pretty much the reality when you have kids and a full time job and you’re already doing daily programming practice. Most of my responsibilities get kicked off to the weekend. Even then I’m just barely pulling my weight. Anyway, we’ll see.

Chris Franklin commented on Twitter about month-long jams that require much smaller commitments of time on any given day. That sounds pretty good to me. I’ll continue with what I have and see where it goes. Maybe I’ll abandon it after this weekend… maybe I’ll continue with it through December, or maybe I’ll even just continue with it until I get bored of it or decide to finish it off.

It’s a tough call. I get that the point is to just start making something, because making something is better than making nothing. I 100% get that. It doesn’t mean it’s easy to overcome those built-in walls, though. :-) And, of course, I see something awesome like this Space Harrier-esque Pico-8 game and figure I should just effing do it in Pico-8.

Bernband Review

I’m drawn to the idea of walking simulators. It’s no coincidence that my first “intentional” game review was of Proteus. Modern so-called “AAA” gaming is full of violence, so walking simulators are a nice conceptual departure. I don’t have a lot of experience with walking simulators, but I have played a few, including: Dear Esther, Proteus, Shelter, The Stanley Parable and Gone Home. Even among that small selection, there’s a wide variation of gameplay and themes, so clearly this is a broad genre worth exploring.

To start off, I really love the visual aesthetic of Bernband. I’m somewhat nostalgic about lo-fi graphics, but it’s not really “my bag” per se. There was a time when my main interest in videogames was seeing bigger, better, shinier graphics. Even these days I’m still rather interested in the continually advancing hedonistic graphical treadmill and like to push my “fancy pants” GeForce 660 Ti to its limits. With all that being said, there’s a strong appeal in lo-fi graphics. I think giving your mind the freedom to fill in the blanks is a big part of it; perhaps it’s even the only part of it. From a purely technological standpoint I’m curious about these kinds of graphics because the burden of implementation appears to be much lower, and I’m a game developer wannabe. Bernband takes advantage of this mental “filling in of the gaps,” but simultaneously fails to really push it to do something wonderful. You see musicians repeating the same too-few-frames looping animation, and other passive walkers just bumble around with barely any agency. In short, it’s kind of a mixed bag.

My gaming PC is a Linux desktop, and that sometimes presents challenges for gaming. Bernband provides a native Linux client, as more and more games do these days - just take a look at the list of Linux-compatible games on Steam. Unfortunately, my experience wasn’t hurdle-free. Bernband dynamically links against 32-bit libraries, but my OS is 64-bit, so I had to install 32-bit versions of libglu, libxcursor and libopenal. Actually, the game runs without 32-bit libopenal (just without sound), and this is how I first experienced it. Without sound it’s a little boring, so I improvised: I found a Bernband trailer and looped the music from it - Habits, by DJShrike. This was actually a pretty good experience. After a quick run-through, I sorted out the libopenal issue and played with the originally intended sounds. Honestly, I preferred the experience with the looping soundtrack. The in-game sounds are somewhat lo-fi like the visual aesthetic, but also repetitive and lacking character. They’re fine and passable; it’s just that I prefer my experience playing with the soundtrack instead. The looping soundtrack lets your mind wander and improvise all the sounds you’d expect to hear, in similar style to how the lo-fi visuals let your mind fill in the gaps.
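For anyone on an Arch-based 64-bit system hitting the same wall, the fix amounts to pulling the lib32 packages from the multilib repository (enable [multilib] in /etc/pacman.conf first; package names per Arch’s multilib repo):

```shell
# 32-bit GLU, Xcursor and OpenAL for a 32-bit dynamically linked game.
sudo pacman -Syu lib32-glu lib32-libxcursor lib32-openal
```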

I’m specifically opting to review games that are outside of the norm. My intention is to broaden my experience and escape the narrow bubble in which I live my everyday life. It’s great to absorb myself in the things I already like, but there is so much to experience that’s simply different. I read books, watch movies and listen to music a little outside of my normal tastes, and now I want to do the same with games. This is an interesting experience, because so-called “walking simulators” are nothing like mainstream games. I struggle with Bernband like I struggled with Proteus. There is a lot to like in the experience, but it seems shallow upon reflection. There are hints of greatness, but what little agency is available quickly highlights how shallow the simulation is. I’ve broadly divided my desired gaming experiences into two categories: experiential and ludic games. Bernband is definitely an experiential game much more so than a ludic one, but it fails to deliver a particularly noteworthy experience. The setting is excellent and the presentation is mixed, but good; the game just doesn’t actually say anything or go anywhere. It’s very shallow. Despite the tone of what I’ve written here, I still think that’s okay. I largely enjoyed my Bernband experience and would love to see more exploration of that setting.

Proteus Review

The best way I can think of to open this review is by quoting myself: “What a weird, boring, delightful, pleasant game. It is really difficult to feel [it] or describe it in one way.”

Proteus is boring. You walk around and have very little agency in the world. You can chase creatures and not much else. I really wish days could be shortened to 5 minutes instead of the current shortest value of 15 minutes. It’s also very soothing. The ambient music works its way into your subconscious. The chirps, squeaks and squawks of the wildlife are enchanting and harken to real outdoor life. There are moments when the game appears to adjust your walking speed downward, and this calms rather than aggravates.

There is one specific interaction that works for me and represents my favorite gameplay mechanic in the game. Before the seasons change, a collection of sprites gathers and swarms, spinning more rapidly as you approach its center. You can get close enough that they reach their maximum speed, yet simultaneously hold back and watch the effects on the world. In this state of maximum chaos I quite enjoyed watching the moon and the stars in the night sky, following them as they sped around the planet housing the island I was exploring. My mind would wander to the heavens, imagining similar - or drastically different - landscapes in the cosmos, all waiting to be explored.

There are really nice touches in Proteus. I like that there’s a button you can hold down which slowly blackens the screen in a way that looks like you’re closing your eyes. Hold it down long enough and your eyes close completely, eventually kicking you back to the main menu. The snapshot feature is neat. Ultimately snapshots are just game saves, but they feel like a photo album and stepping into them is fluid like exiting to the main menu is. On the Vita you can enable gyroscopic freelook; a feature I found to greatly increase immersion.

In Proteus it’s a pleasure to discover new creatures you never noticed before. Interacting with some creatures has an effect. What other creatures might cause some effect? The buildings and landmarks intrigue. Some are clearly the remnants of previous civilizations. But the whole world still feels mostly empty and pointless. There is joy for sure, but there is also boredom. The more I played, the more I enjoyed what I was doing and started getting hooked into the atmosphere, and the more I appreciated the overall experience. I’ve now played three times, each several months apart. Proteus is an experience that will always be welcome in small doses over long stretches like this.

Inspired to Share Gaming Experiences

This past weekend was a little rough. Dee and I were both feeling really under the weather, so we pretty much got as little done as possible. Dee took over responsibilities for dropping Lily off with her folks, so I jumped at the opportunity to catch up on a few shows. Among them was Jodorowsky’s Dune, a film I found rather inspirational.

For a long time now I’ve felt rather disillusioned with modern AAA videogames. My preference is definitely for more lo-fi entries and throwback “Retro” games. Things like 1001 Spikes, Super Meat Boy and more. But that’s one part of it. I’m also really intrigued by experience-based games. Gone Home is one of my favorite games of all time. Games like MirrorMoon EP and Fract OSC are also wonderfully different experiences focused on exploration and wonder instead of violence and linear progression.

I’ve decided to take my inspiration from Jodorowsky’s Dune and apply it toward the latter interest. I’m currently curating a list of games falling into both styles mentioned above. I plan on playing through them and writing up my experiences and thoughts. Concurrently, I’m going to explore an experience-based game design of my own. I’ve only just begun, so I make no guarantees on delivery whatsoever. However, I am personally motivated to do it, and I plan to keep it small and focused. I will also be focusing a lot on building technology for it. My hope is that this configuration of passions is sufficient to see the design through to its end.

Well, that’s it for now. See you next time Space Cowboy.

Data and Domain Isolation

Today I present some scattered thoughts on organizing data and logic in programs.

I’m starting with MVC because it’s pervasive and well understood. As a pattern for wrangling complexity it’s attractive because it’s so simple: organize your program into three main components, and that’s it. As a bonus, the file organization of your program can just follow the pattern and use three folders. This simplicity makes it easy for new teammates to understand what’s going on. Because it’s so simple, however, MVC has some limitations you encounter pretty quickly: inevitably it winds up stacking a bunch of logic into either the controller layer or the model layer. In the Ruby on Rails community the response has been Service Objects - really a modification of MVC into what I’ll call MVCS. Instead of three layers, you now get four; the Service Objects form a new layer intended solely to house application logic. In my experience, this additional layer works pretty well in practice. While it’s a simplistic bandaid on the also-simplistic MVC pattern, it does improve understandability in the programs I’ve used it with.
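To make that concrete, here’s a minimal sketch of the idea in plain Ruby. The class name and logic are hypothetical, and a real Rails service object would operate on an ActiveRecord model rather than a hash:

```ruby
# Hypothetical service object: the application logic ("publishing a post")
# lives here instead of bloating a controller or model.
class PublishPost
  def initialize(post, clock: Time)
    @post  = post
    @clock = clock   # injected so the logic is easy to test
  end

  # Returns true on success so the caller can branch on the result.
  def call
    return false if @post[:published_at]   # already published; do nothing
    @post[:published_at] = @clock.now
    true
  end
end

post = { title: "Hello" }
PublishPost.new(post).call   # => true
PublishPost.new(post).call   # => false (already published)
```

The controller shrinks to `PublishPost.new(post).call` plus a branch on the boolean, and the logic is trivially unit-testable in isolation.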

Let’s talk about data. Organizing and modelling data is almost a non-issue in Rails applications, because everybody just uses ActiveRecord and puts all models into the models/ directory. That’s it; everything lives happily together in a monolithic codebase. No problem. But what about “microservices”? By microservices, I simply mean a collection of applications (usually smaller, though not necessarily “micro” as the name implies) that communicate with each other in place of a single, larger, tightly coupled application.

I am not at all an expert on microservices, but I have some experience working with them. I’ve read almost no literature on the topic, so my perspective is most likely naive in that regard. Before I start talking about data, I will preface that there are many ways to decompose programs into microservices and I’ve only had experience with one particular direction - decomposing based on application domain. Follow me on a detour now, while I set up a sample program that we’ll talk about in terms of microservices.

The Detour

Let’s use a hypothetical music application as an example. Say our application will allow the user to rip CDs, purchase music from an online source, upload music and search for music, and it will also provide a recommendation engine. If we were going to decompose based on domain, we might end up with separate applications for each of:

  1. CD ripping
  2. Purchasing
  3. Uploading
  4. Search
  5. Recommendations

Most likely, such an application would have a single front-end interface to present to the end user. This is a great starting point: it seems like we’ve got some well-defined, separate domains based on specific functionality. What implications does this division have for our data model?

Some questions come to mind:

How do we organize our data across these applications? Purchasing music requires access to user purchase history. Recommending music requires access to the user’s existing music collection, which may or may not consist entirely of their purchase history, but probably also would like access to search history and uploads. Purchasing music should probably take into account music that’s already been uploaded by the user. We should allow the user to download music multiple times and to different machines.

End Detour

There are two main methods I can visualize for organizing our data.

Option 1: Each application keeps all of its data local to itself, in an isolated database. This is optimal for keeping services independent from each other, but incurs costs in replicating data and dealing with stale data.

Option 2: We share all data in a universal database and have each application grab whatever it needs. This is great for data normalization and accuracy. If there is complex business logic associated with reading and writing data, though, and that business logic lives outside of the database itself, then we run the risk of misusing the data, or of duplicating business logic across multiple applications.

There are many hybrid approaches that optimize for different needs. You can have isolated databases that replicate to and from a universal master. You can restrict services to only the data they need. You can share business logic as library code - and you can even do this in multiple ways that approach language-agnosticism. It’s important to keep in mind that with all of these decisions we’re not necessarily reducing complexity; we’re trying to isolate it and improve some mix of runtime cost, memory cost, development cost, data accuracy or any number of other factors. Any such decision is always going to be simpler if all the requirements are laid out clearly and prioritized. This is a situation where it’s particularly important not to engage in architecture astronautism and to focus on the constraints that are known, as decisions made here can have unforeseen long-term effects that may be difficult to address.

Next time I want to start talking about application communication in a microservice architecture. Data organization and models can have a large impact on long-term costs associated with developing and maintaining a project. Decisions about application communication can amplify those costs (or savings).