Posts

Showing posts from 2015

Java and JSON ain't friends

Most application frameworks provide some REST support, which is – depending on the language you are using – either dirt cheap or quite complex. In the Java world, part of these frameworks is some kind of mapping from JSON to Java and vice versa, most of them using the Jackson mapping framework. It feels quite natural: you model your domain objects directly in Java. If you don’t have any constraints, the JSON might even follow your model. If the JSON is predefined (as part of the API), you can either design your Java classes so they fit the generated JSON, or provide Jackson with some mapping hints to do so. But you know all that, right? So what am I talking about here? The point is: domain models may vary in size from a few properties to deeply nested sky-high giants… and so do the resulting Java model classes. What makes things even worse is that domain models change over time. Often you don’t know all the requirements up front, and requirements change over time. So domain models ar...
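A minimal sketch of the kind of mapping meant here, assuming a hypothetical Customer class; @JsonIgnoreProperties is one way to keep deserialization working when the JSON grows new properties over time:

```java
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import com.fasterxml.jackson.databind.ObjectMapper;

// Hypothetical domain class; class and field names are illustrative only.
@JsonIgnoreProperties(ignoreUnknown = true) // tolerate properties added to the JSON later
public class Customer {
    private String name;
    private String email;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }
}

class MappingExample {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // JSON -> Java: the unknown "newField" is simply ignored
        Customer customer = mapper.readValue(
                "{\"name\":\"Jane\",\"email\":\"jane@example.com\",\"newField\":42}",
                Customer.class);
        // Java -> JSON
        String json = mapper.writeValueAsString(customer);
        System.out.println(json);
    }
}
```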

Use MTOM to Efficiently Transmit Binary Content in SOAP

Currently JSON-based REST services are en vogue, but when it comes to integrating enterprise services, SOAP is still widely used. In a recent project I had to deal with binary content sent to a third-party SOAP service. Thanks to great tooling, calling a SOAP service is not a big deal. But the binary data varied in size from a few kB to many MB, and this brought up some issues with transmission size and memory usage. That's where MTOM comes to the rescue, a standard for efficiently transmitting binary data in a SOAP request. This article published on DZone describes what MTOM can do for you by converting a tiny Spring client-server project from default SOAP to MTOM.
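As a hedged sketch (not the article's exact code): on a Spring-WS client, MTOM can be switched on via the JAXB marshaller, so large binary parts travel as MIME attachments instead of base64-encoded text inside the envelope. The context path and endpoint URI below are placeholders:

```java
import org.springframework.oxm.jaxb.Jaxb2Marshaller;
import org.springframework.ws.client.core.WebServiceTemplate;

// Sketch only: the context path, endpoint URI and generated JAXB classes
// are placeholders for whatever your WSDL produces.
public class MtomClientConfigExample {

    public static WebServiceTemplate mtomTemplate() {
        Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
        marshaller.setContextPath("com.example.documents"); // placeholder package of generated JAXB classes
        marshaller.setMtomEnabled(true);                     // send binary parts as MIME attachments

        WebServiceTemplate template = new WebServiceTemplate();
        template.setMarshaller(marshaller);
        template.setUnmarshaller(marshaller);
        template.setDefaultUri("http://localhost:8080/ws");  // placeholder endpoint
        return template;
    }
}
```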

Spring Integration Tests with MongoDB rulez

While unit testing is always preferable, integration tests are a good and necessary supplement to perform end-to-end tests or tests involving (third-party) backends. Databases are a candidate where integration tests make sense: usually we encapsulate persistence behind some kind of repository service layer, which we can mock in tests of the code running against the repository. But when it comes to testing the repository itself, integration tests are quite useful. Spring integration tests allow you to test functionality against a running Spring application, and thereby against a running database instance. But as in unit tests, you have to set up your test data properly and clean up the database afterwards. That's what this article published on DZone is about: proper database set-up and clean-up in Spring integration tests with MongoDB.
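A minimal sketch of such set-up and clean-up (not necessarily the article's exact approach), assuming a placeholder AppConfig, Customer document and CustomerRepository; MongoTemplate inserts fixtures before each test and drops the collection afterwards:

```java
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

import static org.junit.Assert.assertEquals;

// Sketch only: AppConfig, Customer and CustomerRepository are placeholders
// for your own Spring configuration, document class and repository.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = AppConfig.class)
public class CustomerRepositoryIntegrationTest {

    @Autowired
    private MongoTemplate mongoTemplate;

    @Autowired
    private CustomerRepository repository;

    @Before
    public void insertTestData() {
        // Set up a well-known fixture before every test.
        mongoTemplate.insert(new Customer("Jane"), "customers");
        mongoTemplate.insert(new Customer("John"), "customers");
    }

    @After
    public void cleanUpDatabase() {
        // Drop the collection so tests stay independent of each other.
        mongoTemplate.dropCollection("customers");
    }

    @Test
    public void findsAllCustomers() {
        assertEquals(2, repository.findAll().size());
    }
}
```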

Job DSL Part III

The previous part of this little series on the Job DSL gave you some examples of maintenance, of automating the job creation itself, and of creating views. This last installment will complete the little round trip through the Job DSL with some hints on documentation, tooling and pitfalls. Documentation: If you search the internet for the Job DSL, one of the first hits will be the corresponding wiki. This is the most valuable source of information. It is well structured and maintained, so new features and missing pieces are filled in regularly. If you are looking for any details on jobs, the job reference is your target. If you would like to generate a view, there is a corresponding view reference. Job DSL Source: The documentation on the Job DSL is quite extensive, but so is the Job DSL itself. They are steadily closing the gaps, but sometimes a piece of information is missing. A prominent example: enumeration values. There are some attributes that only accept a...

Job DSL Part II

In the first part of this little series I was talking about some of the difficulties you have to tackle when dealing with microservices, and how the Job DSL Plugin can help you automate the creation of Jenkins jobs. In today’s installment I will show you some of the benefits in maintenance. We will also automate the job creation itself and create some views. Let’s recap what we have so far. We have created our own DSL to describe the microservices. Our build Groovy script iterates over the microservices and creates a build job for each using the Job DSL. So what if we want to alter our existing jobs? Just give it a try: we’d like to have JUnit test reports in our jobs. All we have to do is extend our job DSL a little bit by adding a JUnit publisher: freeStyleJob("${name}-build") { ... steps { maven { mavenInstallation('3.1.1') goals('clean install') } } publishers { archiveJunit('/ta...

Job DSL Part I

Jenkins CI is a great tool for automating your build and deployment pipeline. You set up jobs for build, test, deployment and whatever else you need, and let Jenkins do the work. But there’s a catch. In the recent blog post Bringing in the herd I already talked a bit about the difficulties you have to tackle if you are dealing with microservices: they are like rabbits! When you start with a project, there may be only a couple of microservices, but soon there will be a few dozen or even hundreds. Setting up jobs for these herds is a growing pain you have to master, and that’s where the Job DSL comes to the rescue. This post is the start of a small series on the Job DSL. One lesson we already learned about microservices is that you have to automate everything. Even – or especially – the configuration of the tooling used to build, deploy, and monitor your application. Not to mention the things you have to do to run the application, like distribution, load balancing, etc. But let’s start with the b...

Bringing in the herd

Everybody is doing microservices at the time of writing. They promise to solve the problems we had with monolithic architectures: they are easy to deploy, scale, understand, and throw away, they are resilient, and they may be implemented using different technologies. That’s a hell of a lot of promises, but there are also downsides: microservices come in herds, and herds are hard to handle ;-) In our current project we use the Jenkins CI server to implement a continuous integration pipeline. For every microservice we have a couple of jobs: Build: compile the classes, build a jar, run the JUnit tests. ITest: run the integration tests against the built jar. Deploy: deploy the microservice to the environment. These steps are run one after another using the Build Pipeline Plugin. But when it comes to getting an overview of the state of the jobs, you have few choices: the All-View is quite inadequate for that. Even if you have only a couple dozen services, there are three job...