Intellectual Property Rights and Co-Creative Games

Watching the amazing creativity of so many people at College of Wizardry: Nibelungen 4, and at the various wizard school larps I’ve been to in the past, combined with the failure of Dziobak Larp Studios, which ran the College of Wizardry series of games, has finally prompted me to put some thoughts to paper.

College of Wizardry 13 in Czocha Castle, Poland.

Copyrights and intellectual property are a problem with co-creative games and endeavors. In the case of a larp, especially a larp where players and volunteers contribute a great deal, the topic of intellectual property gets very messy very quickly. As long as everyone’s friendly and the sun is shining, there is no problem. But when someone who has contributed a lot to the community decides to leave, or is barred for safety reasons, or when the game is sold or transferred, or the game runners decide to make it a for-profit enterprise, these issues can become a major hurdle.

The traditional creative industry (video games, movies etc.) tends to be pretty clear about who owns what rights; for example, if you are hired as a photographer for a film production, you agree to a work-for-hire contract and the company owns your work, period. (How predatory this model is is an entirely different conversation.) You get paid, someone owns the creative output and has full control over it.

In a game, there is significant creative and professional work that goes into the creation of design documents, meta-techniques, workshop structure, game structure, behind-the-scenes game organization and a great number of other similar aspects. The design of the lore and setting to support the style of game, by picking the location, by building in conflicts, by guiding the types of stories that will be created, and making this mesh with the rest of the game design is the heart of a larp. It’s an art, and the people involved in it deserve their recognition and rewards and to claim their credit. Doing it right is incredibly hard. It’s definitely work worth protection and payment — though on the flip side, the idea that someone asserts copyrights or demands payment for advances in workshops or safety techniques might come across quite poorly in the community.

And yet, no matter how good and complete this design is, it will never be enough to run a game without creative input from others.


A larp like College of Wizardry is different from a movie production. One of the most important parts of this kind of larp is the co-creative aspect. The setting and all the other things mentioned above are provided by the organizers, as are, potentially, short character descriptions. However, players bring the characters to life, round them out, and change them into something very different from what the character writer might have had in mind.

Players and volunteers create crests, logos, companies, products, props, runes, rituals, songs, outfits, history, customs, in-game magazines, poetry, and countless other creative things. They come up with families and backstories for their characters. Players may define how fae look and work in a given game, or werewolves, or whether a teacher and a student are allowed to go to the ball together. Players may define school rules. And indeed this supports the replayability of the game; no two runs will take place in the same world, and there will always be something new and different even for returning players.

This is all content created by players and volunteers, and owned by them. It’s the output of their passion. It’s the world they live in, love in, suffer in, find themselves in. If the game runners try to claim that the likeness of a character, or the concept of a character, or any such thing is theirs, and that you, as the player, can’t bring the character to another game, or write fanfiction about them to post online, that goes very violently against the sense of what is right and wrong.

This kind of game can only work when players and volunteers are encouraged to contribute; when they build each other and the game up; when the organizers can use props and posters left behind, and when people give their portraits to be used in following games. It only works when the players of professors bring supplies for their classes and decorations for their classrooms.

As soon as players feel they don’t own their work, or the character they’ve been living as, and that the work they’ve put into the game now belongs to someone else to profit from, the passion is extinguished. It’s no longer creating fiction together, no longer contributing to a great, common story. It’s now doing unpaid work (indeed, you arguably pay for the privilege!) to fill someone’s vault with new intellectual property that they can use for profit while telling the creators they no longer have any control over it.

People are generally happy to see themselves in character portrayed in documentaries, web pages, and social media. Attempting to use people for direct marketing, especially for a different game than the one they prefer playing, quickly raises hackles, rightfully so, and is legally extremely dubious. The players have not signed model releases, and demanding one as a condition of play is not going to be accepted. Once more, as long as the players believe in what’s being done, they tend to be happy to cooperate, but when the project moves from collaborative fantasy to a business, things change.

Not only that, I fear there’s a shift of expectation. A commercial business with control over the intellectual property would, I imagine, produce a game where players expect to be customers who are catered to. They expect plot; they expect a good experience. That’s in stark contrast to a co-operative game where players understand that the company running it provides the setting, but the game itself remains a co-creative endeavor. Introducing overly restrictive intellectual property rights has the potential to harm the co-creative aspects.

It’s clear that the name of the game, and perhaps its core elements, are intellectual property, and anyone wanting to use them needs permission, and possibly a licensing deal for commercial use. Protecting the core intellectual property of the game against someone trying to take it over or abuse it is probably a wise precaution as well. It’s also clear that fan works, fan gatherings, and, within reason, spinoffs have to be allowed, even encouraged. If a player wants to make team jerseys with a school and house logo for a game, or just in general, maybe there’s a loss of some potential licensing revenue, but much more is gained from the passion, advertising, and loyalty the players show to the game and the world.

The quickest way to get people to bail and make their own game is to send them a cease-and-desist letter demanding payment and adding arduous conditions when a dozen of them want to get together for a weekend in a cottage and finish some plots in character.


There is a legitimate worry that the brand will get tarnished or diluted if people can do whatever they want while using the world, and that if there’s a cheap spinoff it may siphon players from the expensive main event. But that’s a balance that has to be dealt with in a more open, creative commons direction than traditional media would — because that’s exactly what this kind of a larp is: a creative commons.

It’s also clear that volunteers can only run the game so many times, and a group of volunteers can only make it so big and take on so much responsibility. To rent castles and buses, to make agreements, and to work on a game four times a year for years, you need a company of some sort, and some reasonable income to run it. I want to be clear: I would love for people to be able to do this professionally or semi-professionally. How to balance these competing needs in the existing legal intellectual property landscape is a very hard problem. I don’t have answers. I do have a lot of sympathy for people trying to make it work.

It’s impossible to run a larp while paying everyone for their labor; it’s not even close. It has to be a project that excites people, that welcomes and attracts volunteers to pour their hearts and souls into it; a project that has people carrying boxes and setting up candles in the forest from dawn till well past midnight, that has people crafting and sewing. One where, once it’s done, people rally around something they have created together and feel they have a stake and ownership in it; if not financially, then at least in how the created world is treated and how the contributions are acknowledged.







Posted by Toivo Voll in College of Wizardry, CoWlarp, Czocha, LARP

City of Stairs by Robert Jackson Bennett

A friend of mine recommended City of Stairs (The Divine Cities #1) to me, and I picked it up not knowing much about what to expect. The genre is alternate-world fantasy, set in a roughly Victorian, early-1900s era of technology.

In some ways the plot is the best kind of whodunit, starting with a murder investigation that ends up spiraling into something else entirely. There is a lot of dramatic tension and drive, and it’s one of the most “stay up into the night to finish the chapter” books I’ve had the pleasure of reading in a while.

While the prose isn’t as gorgeous and lyrical as, say, Rothfuss’s, the writing and setting and plot are clever. Clever in an intellectual sense, clever in the way it dangles shinies in front of the reader to give pause and reflection. Clever in the way that this entirely alien world really isn’t, and judging the characters’ actions and the justness of the world can’t happen without contrasting it with ours.

The characters are maybe not all that deep, but they’re interesting and original, and good vehicles for exploring everything the author has to say.

And all the while the book is a great straightforward mystery/adventure tale to boot, with great pacing. Nitpicking that some of the terms and language are a bit anachronistic feels awfully curmudgeonly.

Highly recommended, four stars.

Posted by Toivo Voll in Book Review

The Invisible Library Series by Genevieve Cogman

The series consists of The Invisible Library, The Masked City, The Burning Page and The Lost Plot (at least one additional book is slated for publication later this year).

The setting ticks so many boxes. We live in a multiverse, although most residents of its worlds do not know it. Connecting most of these worlds is The Library, an ancient institution that collects books from the various worlds to preserve knowledge, among other reasons. It employs Librarians to acquire these books. There are two other factions: dragons, who embody order, and fae, who embody chaos.

The protagonists are one of these Librarians and her apprentice, and they find plenty of challenges in their seemingly simple task.

The protagonist is great; while there’s a bit of Mary Sue-ism, she has a great internal dialogue that not only sets up moral decisions, but is also funny.

The setting as a whole builds up so many great characters, plot hooks and places that it seems a pity if they won’t be followed up on. As it stands, there are some that seem to be abandoned half-way through, and I can only hope they will get revisited in the future before the ball of plot becomes too unmanageable.

Another very enjoyable aspect is the prose itself. It flows effortlessly, the dialogue is nice, and the vocabulary is unusually rich.

And yet the books are shy of being great. The Masked City in particular was the weakest of the series for me, as it was filled with cinematic action that got to be too much. The pacing and dramatic tension in general doesn’t seem to quite work, although The Lost Plot is perhaps the best in this regard, so hopefully the future works continue with those lessons learned. Whatever it is, the series has all the ingredients to be great, but so far only achieves goodness. They’re easy books to recommend, but not books that keep me up wanting to finish the chapter.

Three and a half stars for the series, except three for The Masked City.

Posted by Toivo Voll in Book Review

Visit to Munich

This past weekend I made a quick visit to Munich. It’s one of my favorite cities, and it’s within reasonably easy reach for me (about four hours by train). I do need to figure out how to find discount train tickets, though!

The first thing I meant to do after I dropped off my bag at the hotel was to find an electronics store I had been to as a child and see what they were selling these days — assuming they were open. Instead I stepped out of the local train into a mass demonstration against the new police powers act. I have to say that seeing such civic involvement to defend people’s privacy and rights was quite emotional. Good for you, Munich!

Eventually I headed for a late lunch at the nearby Zum Dürnbräu restaurant. Talk about history; the location has been serving travelers food in the same place for over 500 years. It’s currently asparagus season (here in Switzerland the headquarters cafeteria has asparagus weeks, the supermarket restaurant has asparagus specials, asparagus everywhere!), so the seasonal menu here also offered asparagus dishes. I opted for a chicken dish, which was indeed very good, combining two kinds of asparagus in a cream sauce. Entrees came with complimentary pretzels, and I added a radler to stay hydrated walking around the city, and because it seemed appropriate for the setting.

Chicken with pepper sauce; white and green asparagus in cream sauce; served with red cabbage, onion, mashed potatoes, fried onions, chives, parsley and assorted other spices.

Complimentary pretzels.

Refreshing radler.


The next day was mostly spent at Deutsches Museum.

In short, it’s the world’s best science museum. There may be others that have a bigger collection of a specific thing, but considering the breadth of their collection — aviation, trains, ships, astronomy, chemistry, biology, mining, machinery, computing, mathematics, physics, ceramics and so forth — they’re unrivaled. They have a staggering collection of historic instruments and specimens of a wide variety.

One particular favorite of mine is the set of classic physics and chemistry hands-on experiments. They haven’t changed much in half a century, but as experiments that let you grasp concepts of physics they’re fantastic. Things like a capacitor where you can vary the distance between the plates and insert dielectric materials, all the while observing changes to the inter-plate voltage, complete with an explanation of how the quantities are related. Unfortunately many of the classic sections still have rather lacking English translations.
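The relationships that exhibit demonstrates are simple to write down: a parallel-plate capacitor has capacitance C = ε0·εr·A/d, and at constant charge V = Q/C, so widening the gap raises the voltage and inserting a dielectric lowers it. A quick sketch, with purely illustrative numbers of my own choosing:

```python
# Parallel-plate capacitor at constant charge: C = eps0 * eps_r * A / d, V = Q / C.
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def capacitance(area_m2, gap_m, eps_r=1.0):
    """Capacitance of an ideal parallel-plate capacitor."""
    return EPS0 * eps_r * area_m2 / gap_m

def voltage(charge_c, area_m2, gap_m, eps_r=1.0):
    """Voltage across the plates for a fixed charge."""
    return charge_c / capacitance(area_m2, gap_m, eps_r)

# Illustrative numbers: 0.1 m^2 plates, 1 mm gap, 1 nC of charge.
v_air = voltage(1e-9, 0.1, 1e-3)
v_wider = voltage(1e-9, 0.1, 2e-3)           # doubling the gap doubles the voltage
v_glass = voltage(1e-9, 0.1, 1e-3, eps_r=5)  # inserting a dielectric drops it
```

This is exactly what the exhibit lets you feel with your hands: pull the plates apart and the voltmeter climbs; slide a dielectric sheet in and it falls.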

Another favorite is the guided tours. A few of them require registration and an additional fee, but most are free of charge. They range from playing around with liquid nitrogen or microscopy to more detailed walks through specific departments. This time I was one of only two people taking the mining tour through the extensive staged mining section (showing the history of mining and various types of mines); the tour was led by a former miner, and to my surprise many of the exhibits turned out to be operational, as the guide fired up wagon lifts, water pumps, and excavators. Once more, the tours are usually limited to German.

The second tour was geodesy; here I was the only person who showed up, so I got a pretty personal tour through a number of the items on display, and a discussion of local history as it related to mapping and cartography in the Middle Ages.

The final guided event for me was the microscopy presentation. This had a lot less to do with actual technology, and was mostly about showing interesting things imaged with the museum’s scanning electron microscope. The presenter was funny and interactive; he took questions and adjusted what he showed based on the interest of the audience. We did get a brief demonstration of the live view and capabilities of the electron microscope with samples in real time, and ended by preparing a piece of dried moss under an optical microscope, finding a tardigrade, and waking it up. The session was supposed to take less than an hour, but we spent closer to two hours at it, and I was convinced to buy the museum’s book on the topic; they are actively involved in using their instrument for research with other organizations in the region, and for independent research of their own. It turns out there is an amazing amount we do not know about sub-millimeter animals.

For dinner I stopped at a recommended vegan kebab joint, Erbil’s. Most of the fare was what you’d expect in your average kebab restaurant, except the meat dishes were made with seitan; it’s even cooked on a vertical rotisserie. In addition there were other delicacies: vegan lasagne, desserts and so on.

Falafel at Erbil’s.

For my final day I visited the Deutsches Museum’s new traffic annex. While it is new and offers a fair bit more space, I was not quite as impressed by it. Showcased were old trains, subway cars, trucks, cars, and motorcycles, but I felt there was not as much information on some of the topics as I would have liked. Their selection of bicycles, I have to mention, was quite impressive, from the earliest to the modern, including a reproduction of a traditional bicycle workshop. As the main museum is undergoing renovations, expected to finish in 2020, some of the exhibits were being moved around; there were a few rocket-powered cars from the rocketry exhibit in the traffic annex, though with next to no additional information. They also had a section of train signals, but with no good explanation of what they meant or signaled. On the other hand, they did have interactive exhibits on hydraulic torque converters, different types of transmissions, differentials, and brake systems. The star there was a full-sized 6×6 truck drivetrain with cleverly placed plexiglass windows that allowed visibility into the operation of all the components from engine to wheels.

After the museum visit, I had the good fortune of meeting up with some friends at the Hungriges Herz bistro, followed by ice cream at True & 12.

Overall a great way to spend a few days.

Marienplatz in Munich.

Posted by Toivo Voll in Travel

Hurricane — The Fears

A CBP Air and Marine Black Hawk aircrew works to bring a surviving family into the aircraft after being hoisted to safety. August 30, 2017. Photo by Alexander Zamora.


This is the first in a multi-part series about life with hurricanes. I’ve added a few explicit details for non-US audiences.

When you live in Florida, the threat of hurricanes is a part of life. As far as natural disasters go, they’re not too bad; there’s typically plenty of warning so you can prepare or evacuate, and unless you live in a flood-prone area or near the shore, the danger is manageable. Nonetheless, there is that little reminder just lurking in the far corners of your mind reminding you that you’re living here at the mercy of Mother Nature.

But what exactly is that threat?

Fundamentally it is fear for both one’s literal life and one’s figurative life: being hurt by the direct impact of the storm, subsequent flooding, or looting and violent crime after the storm on one hand, and losing one’s possessions, with the resulting emotional pain and economic consequences, on the other. For people with families and pets, this fear extends to their loved ones. How high the risk is depends a lot on whom you ask, and often people’s perception doesn’t match reality. Overall, the risk of a major hurricane hitting any one location in mainland Florida, especially on the Gulf Coast, is relatively low. Tampa Bay, for example, has not been hit since 1921, although both Charley and Irma were very close calls.

I live in a house built in 2007, so it incorporates all the updates to hurricane building codes that followed the devastation of Andrew and the 2004–2005 hurricane seasons. This means the roof isn’t likely to go flying off, the windows won’t blow in or shatter, and the structure should stay sound. I’m inland, my lot has not flooded in the past decade, and even the latest floodplain maps don’t list me as being in a flood zone. For anything below a category 4 hurricane the house should be just fine, unless I get unlucky and some object comes flying through my window or patio doors, or we get a truly massive amount of rainfall, in which case any place can suffer flash floods.

I have insurance, but hurricane insurance has very high deductibles, in the thousands of dollars, so it’s only really useful against catastrophic loss. Worse, much of the damage may be caused by water (your window or roof breaks, and the house is inundated by driving rain), and unless you have flood insurance it may be an uphill battle with the insurance company over whether the water damage will be covered.

In the case of Hurricane Irma, my most immediate rational fear was damage to the building, resulting in a lot of hassle and stress: fighting with insurance, finding contractors to repair the house and make it habitable (when millions of other people are competing for their business), finding a lawyer to deal with the insurance company (when millions of other people are competing for their help), significant financial loss, and the loss of items of emotional importance. The secondary fear was of discomfort and inconvenience; days or weeks with no air conditioning, and possibly mosquitoes if the windows got broken. Both of these were high-impact, low-probability fears, but ones I could realistically do something about, and the magnitude of a bad outcome warranted the caution in my mind. The fear of physical injury or death was pretty far down the list.

Fears of running out of medication, food, or water, or of not having a place to sleep, I had been able to counter by preparing for the storm; I will go into those in more detail in a later post about preparing for a hurricane.

I used to drive by the FEMA camps left over from Hurricane Charley, and saw the years it can take a community to get back on its feet and have all its buildings repaired, so I have no illusions of workers showing up the week after and fixing things up in quick order.

I cannot in good conscience end without acknowledging my privilege. I have insurance, I can survive financially having to spend some nights in a hotel or taking Uber to work if my car is damaged, and I can buy supplies ahead of time. I live in a fairly safe community, where I have little fear of looting or of emergency services being unavailable, and I have an employer who will not fire me if I have to stay home to deal with a crisis or to evacuate. Indeed, I have a car and the money for gas, which is what allows me to even entertain the idea of evacuating. I have a social support network with the means to lend me housing, tools and other assistance if necessary. Not everyone, including some of my friends, has these capabilities, and for them many of the threats that I do not have to fear are very real.

Posted by Toivo Voll in Hurricane

The Case for Desired-State Configuration and YANG

I’ve been spending a while now using SolarWinds Orion configuration scripts to harmonize and update configurations on a large legacy network, so the significant limitations of the current model of network configuration are fresh in my mind.

Traditionally, a lot of network equipment such as switches and routers is configured via a text file, where each line holds a configuration setting that gets applied, sometimes with nested blocks of sub-configuration. For example, from Cisco:

hostname myfirstswitch
logging host
interface Ethernet0
 ip address
 ip access-group MyAccessList in
 duplex auto
 speed auto
 no shutdown

Here we set the host name, then configure the first Ethernet interface with an IP address and an access list (similar to a firewall or iptables), set the duplex and speed values, and finally turn the interface on (on some Cisco models an interface defaults to off, so to turn it on you have to turn it not-off).

Traditional methods of configuring the switch are over a serial port or SSH, or possibly by loading configuration commands in as a file via SCP, TFTP and the like. While the exact complications change based on the method used, the basic problem remains the same.

In my particular case, one issue is that there may be random old configuration left behind: references to DNS servers or logging servers that no longer exist. It’s easy enough to add a server, but unless you know there happens to be an old “logging host” entry pointing at a long-decommissioned server, the new configuration doesn’t remove the old one. So now you’re stuck writing rules to look for configuration statements of that particular form that shouldn’t be there. Certainly doable, but it adds a lot of extra complexity.

Another issue is the order of operations. A classic example is the access list above. It’s typically a set of “permit” statements followed by an implicit (or explicit) “deny” statement. So it’s easy to either blank an access list that governs access to the switch and lock yourself out in the middle of the configuration, or to apply an access list before it’s defined. Another common issue is re-addressing devices: you change the IP address and subnet mask on one line, and the default gateway on another, but changing either may cut off your ability to communicate until the other is applied. Once more, there are ways around it, but it still means a lot of extra complexity in planning and scripting.

You can’t just tell the switch what configuration you want it or its components to have. You have to figure out what state it is currently in, and then apply a lot of conditional logic to determine how to get it to the state you want. There are of course projects that try to abstract away some of that complexity, but on some level they just add yet another layer of proprietary components and another black box. Some other vendors, and even some Cisco models, allow configuration sessions with rollbacks, confirms, and more atomic applies, but that’s still short of ideal.
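To make the desired-state idea concrete, here is a minimal sketch (my own illustration, not any vendor’s tool) of reconciling a flat, line-based configuration: compute which stale lines to negate and which new lines to add, rather than blindly pushing the new config on top. The IP addresses are made-up documentation-range values.

```python
def reconcile(running, desired):
    """Return (to_remove, to_add) command lists that take a flat,
    line-based config from its running state to the desired state."""
    running_set, desired_set = set(running), set(desired)
    # Stale lines must be explicitly negated ("no ..." in Cisco syntax);
    # this is exactly the step a naive "push the new lines" approach skips.
    to_remove = ["no " + line for line in running if line not in desired_set]
    to_add = [line for line in desired if line not in running_set]
    return to_remove, to_add

# Hypothetical example: the old logging server lingers in the running config.
running = ["hostname myfirstswitch", "logging host 192.0.2.10"]
desired = ["hostname myfirstswitch", "logging host 192.0.2.20"]
to_remove, to_add = reconcile(running, desired)
```

Real configs are nested and order-sensitive, so a production tool needs far more than a set difference, but this is the core of what the device itself would have to do for us under a desired-state model.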

One attempt to fix this state of affairs is the pair of IETF standards YANG and NETCONF: YANG models the configuration and state data, and NETCONF transfers it as XML via an RPC mechanism (the companion RESTCONF protocol does the same with JSON over HTTP). This approach isn’t perfect either, and isn’t well supported by vendors. One issue is that the capabilities and peculiarities of each platform differ so much that they are difficult to abstract away. At the very least, though, it allows for proper desired-state configuration, which would be a fantastic step forward.
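As a sketch of what model-driven, declarative configuration looks like, here’s a single interface expressed as JSON shaped after the standard ietf-interfaces YANG module (RFC 8343); the interface name and values are made up, and a real device would receive something like this over NETCONF or RESTCONF rather than from a local script:

```python
import json

# Desired state for one interface, shaped after the ietf-interfaces YANG model.
# Under this model the device, not the operator, is responsible for working
# out how to get from its current running state to this one.
desired = {
    "ietf-interfaces:interfaces": {
        "interface": [
            {
                "name": "Ethernet0",
                "type": "iana-if-type:ethernetCsmacd",
                "enabled": True,  # declarative: no "shutdown"/"no shutdown" dance
            }
        ]
    }
}
payload = json.dumps(desired, indent=2)
```

Note there’s no “remove the old logging host” step and no ordering concern anywhere in the payload; stale state and sequencing become the device’s problem, which is the whole point.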

It’ll be very interesting to see whether vendors start supporting the IETF standards or other APIs. It’s hard for me to imagine that, going forward, we wouldn’t quickly start adding APIs for configuration in place of the old SSH and line-by-line configurations; it’s equally hard to imagine we’ll quickly get away from this problem, considering that the typical life cycle of networking gear in enterprise networks is 10+ years.

Posted by Toivo Voll in Information Technology

128 Technology and Secure Vector Routing

Photo: Johannes Winger-Lang

I ran across an interesting new company today, and decided to walk through some of the technology.

You can catch the video here.

The basic idea, as far as I can tell, is that you replace or augment your existing routers with the company’s x86-based boxes. You’re not replacing the underlying Internet, despite what some of the claims might lead you to think — instead they have a proprietary encapsulation/tunneling technology. It’s a lot like a dynamic multi-point VPN, where your traffic moves from one node in your network to another over encrypted tunnels, except here the system builds “tunnels” based on sessions and flows rather than network nodes. What makes the technology really interesting, though, is that it seeks to combine many functions that you get when you maintain a lot of state and know more about your traffic and flows.

In addition to encrypting traffic from point A to point B, it allows you to do traffic engineering / optimization in the vein of SD-WAN — the presentation doesn’t go into full detail, but it’s easy to think of ways that you could optimize for cost and bandwidth, and if application-aware, send VoIP and media streams over low-latency, expensive links and bulk traffic over higher-latency but cheaper links, for example; or shift traffic patterns, allow for overflow peaking to metered links and so forth.

Simply offering an easy-to-manage multipoint VPN — which is currently a major headache that takes a lot of engineer hours to implement — and SD-WAN — which saves money — is a winner, but they aim higher.

If the system knows flows and applications, it’s an easy jump to add security functions to it — firewalls, possibly even IPS/IDS/DLP. Perhaps traffic shaping and policing as well.

There’s a lot of telemetry and visibility that is possible from a modern system that has flow and application-level visibility at every hop. It’s not that current routers couldn’t do this, but they’re badly hamstrung by lagging legacy management schemes such as SNMP.

Configuration of traffic patterns, routing, IP addressing etc. can be done centrally, in the vein of overlays and SDN.

No need to reconfigure anything on the underlying network. The idea that you don’t want to have to ask carriers for anything is pervasive, and it’s attractive for a reason as anyone who’s ever dealt with carriers can attest.

An x86-hardware agnostic approach might allow for a nice range from affordable to high-performance hardware to support many low-cost branches.

High-touch services on the routers? If Cisco is putting container support in their LAN access switches and routers, this may be the way to go.

Where’s the catch? Well, a lot of these things aren’t exactly new ideas, and the difference between wanting to do something and being able to do it is fundamental. Making firewalls is hard. Coming up with a way to route and prioritize traffic is hard even before you add more complex decision criteria to it. Troubleshooting underlying transport issues and how they present through this vector-routed mesh might be a challenge. A particular detail I’m curious about is whether the scheme requires a transport MTU of more than 1500 bytes, or whether it limits the TCP/UDP payload. The presentation says it uses in-band signaling and avoids the complexity of MPLS, but it’s still an encapsulation with effectively another set of headers, unless they have a surefire way to compress every packet enough. How is the reliability, and how does it deal with outages of underlying networks?
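The MTU question is simple arithmetic. Assuming, purely for illustration, that the extra headers cost 40 bytes per packet (the real overhead of their scheme isn’t public in the material I saw), either the transport must carry larger frames or the inner payload must shrink by the same amount:

```python
# Illustrative overhead accounting for any encapsulating overlay.
# 40 bytes is a made-up example figure, not 128 Technology's actual overhead.
TRANSPORT_MTU = 1500
IP_HEADER, TCP_HEADER = 20, 20
OVERLAY_OVERHEAD = 40  # hypothetical: outer headers plus metadata

# Without encapsulation, the standard TCP MSS over a 1500-byte path:
plain_mss = TRANSPORT_MTU - IP_HEADER - TCP_HEADER  # 1460 bytes
# With encapsulation, either the path MTU must grow...
required_mtu = TRANSPORT_MTU + OVERLAY_OVERHEAD     # 1540 bytes
# ...or the inner TCP payload must shrink:
tunneled_mss = plain_mss - OVERLAY_OVERHEAD         # 1420 bytes
```

Neither option is free: jumbo frames aren’t guaranteed across carrier networks, and clamping the MSS silently changes what endpoints can send, which is exactly why I’d want this detail spelled out.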

With the advent of SD-WAN, NSX, ACI, and the already boringly old MPLS infrastructure the engineering and conceptual framework for something like this might be there, though. It does seem to me that if they can deliver on their promises, this would be the perfect time to offer any distributed businesses a simple, single-vendor solution that replaces dozens of expensive, complex, difficult-to-manage products with one centrally managed, software-defined networking stack.

Posted by Toivo Voll in Information Technology

Hidden Figures


During a recent flight I finally had a chance to watch Hidden Figures, the movie about the untold story of the black women who worked as NASA’s human computers and were instrumental in the manned space flight program. They fought dehumanizing, demoralizing racism and even so managed to make key contributions and keep their pride and hope.

In short, the movie should be seen by everyone, and I’m glad to hear that it’s becoming part of school curricula. It touches on a lot of topics I care about. It shows the beauty and importance of mathematics, and dispenses with the idea that mathematics isn’t for everyone. I am slightly bothered by the genius worship in the movie, but that’s a minor niggle. It shows the incredible effort that went into sending humans into space, and it finally recognizes the important role of people who had been written out of whitewashed history.

The portrayal of racism was matter-of-fact, which in many ways lessened the immediate emotional impact until you actually thought about what had just happened. It’s not terribly subtle either, but there is some pointed and clever dialogue:

White woman: “You know I don’t have anything against you.”
Black woman: “I know. I know you believe that.”

The protagonists were lionized as near-perfect; it shouldn’t take that to be respected, and the relatively happy ending also seemed generous compared to the real history.

The protagonists were/are heroes, and they broke down barriers. I am immensely glad the movie celebrates this and gives them recognition. Yet I’m hoping nobody thinks that their accomplishments meant that others had the same opportunities shortly thereafter, or that the continuation of the very same fight isn’t happening today.

There has not been a magic turning point between then and now that has made everything better. Yes, things have gotten better, and they’ve gotten better because of the tenacity of people who refuse to sit in the back of the bus, who refuse to give up their rights, and who fight for equality.

I’ve chosen the still from the movie on purpose. This very same scene is still playing out in way too many meetings I have been part of in my own field generations later. The next time you find yourself at an IT trade show, or a training class, or a work meeting, look around and consider how many women of color there are. Then consider how much brilliance and contribution is going unused in a world where they can’t, or won’t, be part of our profession. Then consider what you can do to change that.

Posted by Toivo Voll in Information Technology

Ops, DevOps, and the Big Picture

DevOps is the way to IT nirvana, magically conjuring up the unicorns of agility, reliability, efficiency, engaged employees, and accelerated implementation schedules.

…Except we know the reality. It’s not about the tools or gimmicks – DevOps is a philosophy, and if an organization is built to follow the right principles, or transforms itself to do so, those goals can be reached.

I want to address one particular aspect of this philosophy that I have not seen discussed enough – the Big Picture. There are two sides to this.

The Goal

Back to basics! The point of all the DevOps magic is to achieve some goal in a more efficient manner. Unless the developers and operations teams know what this goal is, and how they can contribute to it, the wrong initiatives and work get prioritized, and the work done is less meaningful in the end. Surprise: there’s nothing dev-opsy about this, it’s all about good, traditional communication and management. The need for clear direction, planning, and execution, and for communicating it down the organizational structure, doesn’t vanish even if the engineers are wizards. Make sure development and operations engineers have a good idea of what the organization is trying to do, and let them find ways to contribute. It means better-focused work and more meaningfully engaged employees.


Two-Way Communication

Having two-way communication between development and operations isn’t really a DevOps thing either; it’s clearly part of a healthy ITIL, Site Reliability Engineering, Agile, DevOps, or pick-your-methodology organization.

It matters for multiple reasons.

First, if there is a division of labor between operational tasks and development-oriented tasks, communication helps foster better cooperation between the respective employees and groups.

Second, it makes for better solutions – having operations weigh in on what kinds of things make life easier and reduce unnecessary work and engineering on the front end of projects can be incredibly helpful. (*cough* Proper application-level HA. *cough*)

Third, it is a sign of a healthy organization and makes the role of an operations engineer more rewarding. If the operations engineers don’t have the time or the opportunity to get involved on the front end of projects, it either means they’re overworked, or that they’re getting the mushroom treatment: handed projects they have to magically make work without having been able to influence the design. Both are significant red flags.


Like so many parts of the DevOps fever, once you unpack the principles behind it, it turns out that there are some good, common-sense ideas at play. It’s not that DevOps is conceptually that different from the ITIL wheels of continuous improvement, it’s more about figuring out how to actually allow that ideal to be reached without falling into the process morass ITIL brought us. Alternatively, in places where strict controls and processes are unavoidable, there are still great lessons to be learned from DevOps and Agile methodologies. 

Posted by Toivo Voll in DevOps, Information Technology

The Goal and The Phoenix Project


I’m reviewing these two books together, since The Phoenix Project builds largely on The Goal, applying the Theory of Constraints to the IT environment.

The Goal by Eliyahu M. Goldratt is familiar to anyone who has studied management. It tells a fictional story, following a protagonist struggling with production problems at a manufacturing plant. By following the protagonist’s journey in understanding the problem definition, the mechanisms in action, and how to improve the situation, the reader gains the same information, is guided through the logic and thought process, and sees the theory applied to (fictional) practice. It’s not the most riveting piece of fiction ever written, and Goldratt spends too much time showing just how bad the problem is and how frustrated the protagonist is before moving on to the enlightenment steps. That said, it’s certainly a much more pleasant and effective way to convey the concepts Goldratt wants to share than a traditional theory book would be; much like a textbook, it does require the reader to put it down here and there and think through what just happened and was suggested.
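For readers unfamiliar with the book’s central claim, the Theory of Constraints boils down to this: a serial system’s throughput is set by its slowest stage, the bottleneck, so improvements anywhere else change nothing. A minimal sketch, with invented stage rates:

```python
# Toy illustration of the Theory of Constraints: a serial production line's
# throughput is capped by its slowest stage. The stage names and rates
# below are invented for illustration, not taken from the book.

def line_throughput(rates):
    """Units/hour a serial line can sustain: the minimum stage rate."""
    return min(rates)

stages = {"cutting": 120, "heat_treat": 40, "assembly": 90, "paint": 75}

before = line_throughput(stages.values())      # 40: heat treat is the bottleneck

# Doubling a non-bottleneck stage changes nothing...
stages["assembly"] = 180
unchanged = line_throughput(stages.values())   # still 40

# ...while any improvement at the bottleneck lifts the whole line.
stages["heat_treat"] = 55
after = line_throughput(stages.values())       # 55

print(before, unchanged, after)
```

That asymmetry is why the book keeps hammering on finding the constraint first: local optimizations elsewhere only build up inventory in front of the bottleneck.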

The Phoenix Project: A Novel about IT, DevOps, and Helping Your Business Win by Gene Kim, Kevin Behr, and George Spafford follows the same method, but is objectively a much better book. It starts off with a dysfunctional IT organization within a company. Here it shines by painting a picture of archetypal IT staffers and situations with such skill that anyone who has ever worked in IT may be tempted to replace the characters with names from their own organization. The pain points are also all too familiar. It moves along at a much faster pace while still succeeding in conveying the principles and theory it sets out to communicate.

The Phoenix Project in particular should be required reading for anyone in IT operations or development, especially management, to get a better idea of the organization as a whole. By helping the reader understand how to be more efficient and how to spot inefficiencies around them, it is useful at any level of an organization. It additionally serves as a great source of references for more reading, such as Personal Kanban, The Five Dysfunctions of a Team, and The Goal, to name a few.

I highly recommend The Phoenix Project; if you want more of the nitty-gritty of the Theory of Constraints in a still very accessible work, The Goal is a good follow-up.



Posted by Toivo Voll in Book Review, Information Technology