Starting a Camunda Project

In the middle of something? Take a listen to the article here – Episode 2 in our newest podcast series!

What are some challenges that I’m likely to face when trying to deliver my first Camunda project?

Generally speaking, when you are doing a BPM or process orchestration project, the hard part is not articulating the steps – your business people and your IT people already know what those should be. The hard part is the nitty-gritty of actually wiring up the integration steps. For instance, you’ll want to make a RESTful call, but there’ll be some weird thing in the header and you won’t be able to call it. The team that’s responsible for that won’t be able to meet with you until, say, next Tuesday. When you do, they’ll give you a token. You’ll try it, and it won’t work because you’re in the wrong environment. That kind of stuff drives you nuts. Integrations are hard, and they’re hard because they’re imprecise, so the pain is going to be in the nuts and bolts of actually talking to those external systems. It seems like it should be easy, but it’s always painful. Therefore, that’s the place I would start.

Now, typically what I do on my projects is I will always stub those out. If there is an API management system in place like Apigee, MuleSoft, Kafka or something else, I will use that and I will stub out a RESTful API call. Then I’ll let the Java folks, .NET folks, or whoever, figure out the mechanics. When that’s done, I’ll turn the sprocket and I’ll actually get the payload that I expect. If that’s not in place, then I’ll put in a stub, so instead of calling the service to say, “Get a customer from the database,” I’ll actually say, “Hey, what does the customer look like? What does the stub look like?” I will hard-code that while the rest of the team is dealing with the technical difficulties of making that integration work.
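The stubbing idea above can be sketched in a few lines. This is illustrative only (Python for brevity – the same pattern works in Java or .NET, and every name and field here is a made-up example, not from any real system):

```python
# A stub that stands in for the real "get a customer" integration.
# Instead of making the RESTful call, it returns the payload we
# expect the real service to eventually produce.

def get_customer(customer_id):
    """Hard-coded stand-in for the real customer-lookup service."""
    return {
        "customer_id": customer_id,
        "first_name": "John",
        "last_name": "Smith",
        "age": 42,
    }

# The rest of the process is built against this payload shape while
# the integration team sorts out tokens, headers, and environments.
customer = get_customer("12345")
print(customer["last_name"])  # → Smith
```

When the real integration is ready, `get_customer` is swapped for the actual call and nothing downstream has to change, as long as the payload shape holds.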

The reason that I do that is because if I’ve got a 10-step process and the very first step requires that I load a customer, I don’t want to be stopped from working on the rest while this problem gets worked out. I’ll just take “John Smith” as a customer, and, for example, we know that he has 17 fields (first name, last name, age, etc.). We’ll use that to drive the process. In the meantime, I’m going to have one of my smart guys figuring out why this RESTful API call doesn’t work. That allows us to make progress on the process side while concurrently making progress on the technical integration side. One of the ancillary benefits is that when you’re doing testing, you know how much of your time is being spent on the process and how much of it is being spent on the integration.

How do you know that the payloads will have fidelity and really be true to what you need?

You can imagine a scenario where we have a customer that’s coming back and we expect the last name field to be named “last name,” but really, in the database, it’s called “surname.” When we actually make the thing work, it breaks because we were expecting “last name” and we got “surname,” and our code doesn’t know how to deal with it. There is no way to know that every single time off the bat.
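One cheap way to absorb exactly that kind of mismatch is a small adapter that maps whatever the real service returns onto the field names the process expects. A sketch, with illustrative field names (this is one possible approach, not a prescribed pattern):

```python
# Map fields from the real payload onto the names the process expects.
# When the integration turns out to say "surname" instead of
# "last_name", only this mapping changes; the process stays untouched.

FIELD_MAP = {
    "surname": "last_name",
    "given_name": "first_name",
}

def normalize(payload):
    """Rename known variant field names to the ones the process uses."""
    return {FIELD_MAP.get(key, key): value for key, value in payload.items()}

raw = {"surname": "Smith", "given_name": "John", "age": 42}
print(normalize(raw))  # → {'last_name': 'Smith', 'first_name': 'John', 'age': 42}
```

The point is that the fix lives in one place, which is what keeps the cost of getting a field name wrong low.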

However, I submit that it doesn’t matter. I submit that because the cost of the change is cheap, thanks to the fact that we’re stubbing it out. Actually making the change from “last name” to “surname” to “middle name” is something that we can evolve. These are problems that you can negotiate and that you can solve as you’re going through the system. The trick, as always in the enterprise space for the last 15 to 20 years, is iteration. You do it again. You make some mistakes, shake yourself off, you do it again, and you make it better, right?

You’ve gotta embrace that philosophy. You deal with a customer and they have 17 fields, and maybe you get six of them wrong. That’s okay. Move forward with what you can, and then fix the other parts later. That is the heart of what I call “shark architecture.” To me, “shark architecture” means that, like a shark, you have to swim in order to be able to breathe. You have to move forward. There is no stopping. We don’t sit there stuck, not knowing what to do, because some team isn’t done with whatever they’re dealing with. That doesn’t happen on the projects where I’m involved, and I get to keep my sanity.

With all that being said, I would say make some mistakes. Go forward, come back, iterate on it, make it better and better. This isn’t just for delivery, this is for post-delivery. We expect processes to be improvable. We expect to be able to make changes, right? We need to work out our discipline and our cadence for how we incorporate change into what we’re doing. The big mantra of Agile programming has always been to embrace change. Let’s do that, right? Let’s do this thing that we’ve been talking about all this time. I’m a big proponent of this. Just get in there, get stuff done, make some mistakes, learn from them, and do it better.

How do I deal with the fact that in the real world there’s gonna be a lot of variation in the nature of the data, and your approach might lull us into a sense of complacency?

You need synthetic data that is auto-generated by an AI. It is fair to say, for example, that we’re going to bring back “John Smith” and he is going to be a specific payload, but it’s incredibly easy now to say, “Hey, we’re going to bring back one of 20 random customers. One will be John Smith and one will be Joan Stevenson, and so on and so forth.” The data variance can be dynamically changed by an AI. It’s synthetic data that you describe using a grammar, and there are open source tools that can do this for you. You can talk to your favorite AI tool and it can do this for you, or you can just code it up. This is not going to be one hundred percent. It’s not going to cover you for every variation, but it is better than nothing.
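A minimal sketch of that kind of generator, using nothing but the standard library (the names, fields, and drop-a-field rate are all made-up examples; dedicated tools and AI-described grammars can go much further):

```python
import random

# Generate varied synthetic customers so the stub doesn't always
# return the same happy-path payload.

FIRST_NAMES = ["John", "Joan", "Ana", "Wei", "Priya"]
LAST_NAMES = ["Smith", "Stevenson", "Garcia", "Chen", "Patel"]

def synthetic_customer(rng=random):
    """Build one randomized customer payload."""
    customer = {
        "first_name": rng.choice(FIRST_NAMES),
        "last_name": rng.choice(LAST_NAMES),
        "age": rng.randint(18, 90),
    }
    # Occasionally drop a field to exercise the unhappy paths too.
    if rng.random() < 0.1:
        del customer["age"]
    return customer

for _ in range(3):
    print(synthetic_customer())
```

Feeding payloads like these through the process surfaces the “missing field” and “unexpected value” breakages early, while the real integration is still being sorted out.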

Remember, we need to move, we need to make progress. I would much rather have a process that breaks when there is no customer identification number than have a process that hasn’t even been implemented because we’re waiting for everything to be perfect.