Tuesday, 1 December 2020

SpringBoot repackaging and maven.failsafe.plugin

There is an interesting issue with the interaction between Spring Boot and the Maven Failsafe plugin.  Normally the two coexist without problems, but in Spring Boot projects we often want to repackage the artifact into an executable Spring Boot jar.  The normal way to do this is,

        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <version>${spring.boot.version}</version>
            <executions>
                <execution>
                    <goals>
                       <goal>repackage</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>


However, the repackage goal replaces the original jar with the executable one, which can leave the Failsafe plugin unable to find some of the classes it needs for testing.  There are two possible solutions to this issue; either is fine.

Option 1: change the Spring repackage to include a classifier


      <plugin>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-maven-plugin</artifactId>
        <configuration>
          <classifier>exec</classifier>
        </configuration>
        <executions>
          <execution>
            <goals>
              <goal>repackage</goal>
            </goals>
          </execution>
        </executions>
      </plugin>

Option 2: manually include the classes in the Failsafe plugin's classpath


      <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-failsafe-plugin</artifactId>
          <version>2.22.2</version>
          <configuration>
              <additionalClasspathElements>
                  <additionalClasspathElement>${basedir}/target/classes</additionalClasspathElement>
              </additionalClasspathElements>
          </configuration>
          <executions>
              <execution>
                  <id>integration</id>
                  <phase>integration-test</phase>
                  <goals>
                      <goal>integration-test</goal>
                  </goals>
              </execution>
          </executions>
      </plugin>

Cucumber set up for SpringBoot and JUnit 4 or JUnit 5

Cucumber has worked well with pure JUnit 4 projects for quite some time.  The transition to JUnit 5 does change things slightly, but the change isn't particularly bad once things are set up.  Here are the versions that I've been using for these tests:

SpringBoot: 2.4.0
Cucumber: 6.9.0
maven.surefire.plugin: 2.22.2
maven.failsafe.plugin: 2.22.2

JUnit4

pom.xml

The POM must include the following dependencies (versions have been omitted here),

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>io.cucumber</groupId>
            <artifactId>cucumber-java</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>io.cucumber</groupId>
            <artifactId>cucumber-junit</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>io.cucumber</groupId>
            <artifactId>cucumber-spring</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <scope>test</scope>
        </dependency>

CucumberIT.java

The CucumberIT class is the bootstrap part of the Cucumber testing.  Note the JUnit 4 @RunWith annotation.  The feature files are configured here to be in the src/test/resources/features directory.

import io.cucumber.junit.CucumberOptions;
import io.cucumber.junit.Cucumber;
import org.junit.runner.RunWith;

@RunWith(Cucumber.class)
@CucumberOptions(
    features = "src/test/resources/features",
    tags = "",
    plugin = {"pretty", "json:target/cucumber.json"})
public class CucumberIT {}

CucumberContextTestConfiguration.java

This is where the Spring wiring gets done to make sure that Spring Boot starts.  I've also included @AutoConfigureMockMvc for making REST calls to the Spring Boot service from the step definitions, but you don't have to include that.  If you have additional Spring configuration you can add @Bean definitions to this class, but then you need to include the Spring @ContextConfiguration annotation too.  In previous versions of Cucumber, before the @CucumberContextConfiguration annotation was available, you needed a blank Cucumber @Before method in this class to make sure it was found.

In fact, once this is set up it stays the same for JUnit 4 and 5, because it is primarily a Spring configuration, not a Cucumber one!

import io.cucumber.spring.CucumberContextConfiguration;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;

@CucumberContextConfiguration
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@AutoConfigureMockMvc
public class CucumberContextTestConfiguration {}

Feature file and properties

As previously mentioned, the feature files go into the src/test/resources/features directory, which is referenced in the CucumberIT class.  No other properties are necessary for JUnit 4.
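The step definitions themselves are not shown in this post; a minimal hypothetical pair might look like the following (the feature text, class name and endpoint are all illustrative, and the sketch assumes the MockMvc bean provided by @AutoConfigureMockMvc above):

```java
// src/test/resources/features/health.feature might contain, for example:
//   Feature: Health check
//     Scenario: The service is up
//       When I call the health endpoint
//       Then the response status is 200

import io.cucumber.java.en.Then;
import io.cucumber.java.en.When;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.web.servlet.MockMvc;
import org.springframework.test.web.servlet.MvcResult;
import org.springframework.test.web.servlet.request.MockMvcRequestBuilders;

public class HealthStepDefinitions {

  @Autowired
  private MockMvc mockMvc; // available because of @AutoConfigureMockMvc

  private MvcResult result;

  @When("I call the health endpoint")
  public void iCallTheHealthEndpoint() throws Exception {
    // Endpoint is illustrative; any controller mapping would do
    result = mockMvc.perform(MockMvcRequestBuilders.get("/actuator/health")).andReturn();
  }

  @Then("the response status is {int}")
  public void theResponseStatusIs(int status) {
    if (result.getResponse().getStatus() != status) {
      throw new AssertionError("Expected status " + status
          + " but was " + result.getResponse().getStatus());
    }
  }
}
```

Because of cucumber-spring and the @CucumberContextConfiguration class, these step definitions are Spring beans and can @Autowire anything from the application context.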


Running with JUnit5

The transition to JUnit 5 can happen in two stages.  First, simply running on JUnit 5 can remain backwards compatible with all the current JUnit 4 annotations and setup, with minimal changes.  Here are the changes that need to be made.

Also note that versions of the Surefire and Failsafe plugins before 2.22.0 have problems with JUnit 5 because they cannot discover JUnit 5 tests.
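If the build inherits an older Surefire from a parent POM, it may be worth pinning the version explicitly (2.22.2 here, matching the versions listed above; this fragment is a sketch, adjust to your build):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.22.2</version>
</plugin>
```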

pom.xml

The only differences here are that the junit:junit 4 dependency comes out and the JUnit 5 dependencies go in.  Make sure you include the junit-vintage-engine, which is what provides the backwards compatibility for the JUnit 4 annotations, imports, etc.


        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>io.cucumber</groupId>
            <artifactId>cucumber-java</artifactId>
            <version>${cucumber.version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>io.cucumber</groupId>
            <artifactId>cucumber-junit</artifactId>
            <version>${cucumber.version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>io.cucumber</groupId>
            <artifactId>cucumber-spring</artifactId>
            <version>${cucumber.version}</version>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.junit.jupiter</groupId>
            <artifactId>junit-jupiter</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.junit.vintage</groupId>
            <artifactId>junit-vintage-engine</artifactId>
            <scope>test</scope>
        </dependency>

CucumberIT.java

This stays the same.

Feature file and properties

The feature file stays where it was, because it is still directly referenced by the CucumberIT class.  However, there is now a warning from Cucumber about the Cucumber report.  This can be removed by including a new file, cucumber.properties, in src/test/resources.

cucumber.properties

cucumber.publish.enabled=false
cucumber.publish.quiet=true


Full JUnit5

Now for fully transitioning to run Cucumber with JUnit 5.  The test discovery mechanism has changed between JUnit 4 and JUnit 5, which is the reason for much of the following change.  There is no longer a @CucumberOptions annotation, so the options have to be specified in property files instead.

pom.xml

The junit-vintage-engine dependency has gone and the cucumber-junit dependency is replaced with the JUnit 5 specific cucumber-junit-platform-engine.  Again the specific versions have been omitted here.



        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>io.cucumber</groupId>
            <artifactId>cucumber-java</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>io.cucumber</groupId>
            <artifactId>cucumber-junit-platform-engine</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>io.cucumber</groupId>
            <artifactId>cucumber-spring</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.junit.jupiter</groupId>
            <artifactId>junit-jupiter</artifactId>
            <scope>test</scope>
        </dependency>

CucumberIT.java

The JUnit 4 annotations are no longer available, so we use the new JUnit 5 annotation; note the different import path.

import io.cucumber.junit.platform.engine.Cucumber;

@Cucumber
public class CucumberIT {}

Feature file and properties

By default the feature files need to be in the same package as the CucumberIT class (the class annotated with @Cucumber), so they are moved.  The cucumber.properties file that we introduced earlier now has to be renamed to junit-platform.properties (but it stays in src/test/resources).  Also, because @CucumberOptions no longer exists, we can include the plugin options here too.

junit-platform.properties

cucumber.plugin=pretty,json:target/cucumber.json
cucumber.publish.enabled=false
cucumber.publish.quiet=true

Thursday, 11 June 2020

New Windows Terminal

Microsoft has recently introduced a new tabbed Windows Terminal which allows different terminal types to be used within one main window.  This is a huge improvement over having a number of cmd prompts open, or even using Console2.

To make the new terminal even better you can also add in other terminals that you want to open as a tab.  Using this functionality with Git Bash is a good way to go.

Add Git Bash

Open the terminal and click the down arrow in the menu bar.  Choose Settings.  Notepad should open and the settings can be edited there.  To add Git Bash as an option, simply add the Git Bash profile settings as below.

  {
      // Make changes here to the gitbash profile.
      "guid": "{00000000-0000-0000-0000-000000000001}",
      "name": "Git Bash",
      "commandline": "C:\\Program Files\\Git\\bin\\bash.exe -i -l",
      "hidden": false,
      "icon" : "C:\\Program Files\\Git\\mingw64\\share\\git\\git-for-windows.ico",
      "startingDirectory" : "%USERPROFILE%"
  },

If you want to make Git Bash the default then you just change the defaultProfile setting towards the top of the file to the profile's guid,

    "defaultProfile": "{00000000-0000-0000-0000-000000000001}",

Save the changes and Terminal will automatically update and Git Bash will be available.

Friday, 1 March 2019

AWS, Spring, Localstack

Using the AWS Java client is very straightforward.  Unit testing is also quite simple: just mock the AWS classes.  However, integration testing is more complicated.  Here is an example of using Localstack, TestContainers and Spring to wire AWS objects to point at the Localstack instance.

Localstack: an implementation of AWS which runs locally, either natively or in a Docker container
TestContainers: a Java library that lets a Docker container be run locally for testing

Here TestContainers is used to start the localstack docker image so that the AWS calls can be made against it.


Maven dependencies

Using the v2 AWS libraries requires bringing in the AWS BOM (bill of materials), so that each dependency can be declared without a version and the BOM takes care of selecting the correct version of each.  In the example below the S3 and SQS dependencies are configured.

Dependency management & dependencies

  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>software.amazon.awssdk</groupId>
        <artifactId>bom</artifactId>
        <version>2.4.11</version>
        <type>pom</type>
        <scope>import</scope>
      </dependency>
    </dependencies>
  </dependencyManagement>

    <dependency>
      <groupId>software.amazon.awssdk</groupId>
      <artifactId>s3</artifactId>
    </dependency>
    <dependency>
      <groupId>software.amazon.awssdk</groupId>
      <artifactId>sqs</artifactId>
    </dependency>

Test dependencies:

    <dependency>
      <groupId>org.testcontainers</groupId>
      <artifactId>testcontainers</artifactId>
      <version>1.10.6</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.testcontainers</groupId>
      <artifactId>localstack</artifactId>
      <version>1.10.6</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>cloud.localstack</groupId>
      <artifactId>localstack-utils</artifactId>
      <version>0.1.18</version>
      <scope>test</scope>
    </dependency>


AWS Configuration

The normal Spring configuration for AWS clients is very straightforward.  Here is an example of an S3Client and an SqsClient using the AWS v2 objects.

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.sqs.SqsClient;

@Configuration
public class AwsConfiguration {

  @Bean
  public S3Client s3Client(){
    return S3Client.builder().region(Region.EU_WEST_1).build();
  }

  @Bean
  public SqsClient sqsClient(){
    return SqsClient.builder().region(Region.EU_WEST_1).build();
  }
}

These objects will use the default AwsCredentialsProvider chain, but this can be overridden here.
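For example, to pin static credentials instead of relying on the default chain, the v2 builder accepts a credentials provider (this is a sketch; the key values are obviously placeholders):

```java
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;

// Inside the @Configuration class above, the bean could be built as:
//   @Bean
//   public S3Client s3Client() { ... }
public class StaticCredentialsExample {

  public static S3Client buildClient() {
    return S3Client.builder()
        .region(Region.EU_WEST_1)
        // Explicit credentials instead of the default provider chain
        .credentialsProvider(
            StaticCredentialsProvider.create(
                AwsBasicCredentials.create("access-key-id", "secret-access-key")))
        .build();
  }
}
```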


TestConfiguration

To create the test configuration we need to start Localstack using TestContainers.  This test configuration starts Localstack and then configures the S3Client and SqsClient to point at it.

import static org.testcontainers.containers.localstack.LocalStackContainer.Service.S3;
import static org.testcontainers.containers.localstack.LocalStackContainer.Service.SQS;

import java.net.URI;
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.context.annotation.Bean;
import org.testcontainers.containers.localstack.LocalStackContainer;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.CreateBucketRequest;
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.CreateQueueRequest;

@TestConfiguration
public class AwsConfigurationTest {

  @Bean
  public LocalStackContainer localStackContainer() {
    LocalStackContainer localStackContainer = new LocalStackContainer().withServices(SQS, S3);
    localStackContainer.start();
    return localStackContainer;
  }

  @Bean
  public S3Client s3Client() {

    final S3Client client = S3Client.builder()
        .endpointOverride(URI.create(localStackContainer().getEndpointConfiguration(S3).getServiceEndpoint()))
        .build();

    client.createBucket(CreateBucketRequest.builder().bucket("test_bucket").build());

    return client;
  }

  @Bean
  public SqsClient sqsClient() {

    final SqsClient sqs = SqsClient.builder()
        .endpointOverride(URI.create(localStackContainer().getEndpointConfiguration(SQS).getServiceEndpoint()))
        .build();

    sqs.createQueue(CreateQueueRequest.builder().queueName("test_queue").build());

    return sqs;
  }
}
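A test can then pull this configuration in with @Import and autowire the clients, which transparently hit the Localstack container.  This is a sketch (the test class name and assertion are illustrative) assuming the JUnit 4 Spring runner:

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.context.annotation.Import;
import org.springframework.test.context.junit4.SpringRunner;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.ListBucketsResponse;

@RunWith(SpringRunner.class)
@SpringBootTest
@Import(AwsConfigurationTest.class)
public class S3IntegrationIT {

  @Autowired
  private S3Client s3Client; // points at the Localstack container

  @Test
  public void bucketCreatedByTestConfigurationExists() {
    // "test_bucket" was created in the test configuration above
    ListBucketsResponse buckets = s3Client.listBuckets();
    boolean found = buckets.buckets().stream()
        .anyMatch(b -> b.name().equals("test_bucket"));
    if (!found) {
      throw new AssertionError("test_bucket not found in Localstack");
    }
  }
}
```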




Monday, 26 November 2018

Enzyme Cheat Sheet

Here is a reminder of the enzyme tips I've picked up so I don't forget!

Basics

Mount a react component in a test

import React from 'react'
import { mount } from 'enzyme'

describe('MyReactComponent', () => {
  const $ = mount(
     <MyReactComponent />
  )

  ...
})

Now different expectations and finds can be used to test,

A React Class
expect($.find('MyReactClass').exists()).toBe(false)

An html element
expect($.find('button').exists()).toBe(false)

An html id
expect($.find('#my-id').exists()).toBe(false)

A style
expect($.find('.some-style').exists()).toBe(false)

Html properties
expect($.find('[aria-label="Save"]').exists()).toBe(false)
expect($.find('[title="Save"]').exists()).toBe(false)

Combination - Html element & Property
expect($.find('button[title="Save"]').exists()).toBe(false)

HTML / Text / Contains

Text content.  Find the first of many 'myclass' css classes and check the text of the element
expect($.find('.myclass').text()).toContain("My Value")
expect($.find('.myclass').text()).not.toContain("My Value")

As above but with an exact text match
expect($.find('.myclass').text()).toBe("The exact text")

As above but getting the whole html of the element rather than just the text
expect($.find('.myclass').at(0).html()).toBe("<div>something</div>")

Focus

Focus a field with enzyme
const inputField = $.find('#myInput')
inputField.getDOMNode().focus()

Validate that a field has focus
expect(document.activeElement.id).toBe('the id expected')

Props

Find an element and get the html properties
const button = $.find('[aria-label="My Button"]')
expect(button.props().disabled).not.toBe(true)

Once an element has been found the react props can be used for expectations by also using props()
expect($.find('MyReactClass').props().options).toEqual([{option: 'option1'}, {option: 'option2'}])






Monday, 12 November 2018

Formik

Formik is a brilliant React form builder.  It is simple and intuitive.  It includes validation using a third-party library; I'm using Yup below.

Basic Formik

The example below shows the basics of using Formik.  The internal properties and functions of Formik take care of all the plumbing.  All you need to do is wire the onChange and onBlur functions to the individual inputs.

<div>
  <Formik
    initialValues={{
      title: '',
      firstName: '',
      surname: ''
    }}
    validationSchema={Yup.object().shape({
      title: Yup.string()
        .trim()
        .required('Please enter a title')
        .max(5, 'Too many characters (Maximum 5 allowed)'),
      firstName: Yup.string()
        .trim()
        .required('Please enter a firstName')
        .max(100, 'Too many characters (Maximum 100 allowed)'),
      surname: Yup.string()
        .trim()
        .required('Please enter a surname')
        .max(100, 'Too many characters (Maximum 100 allowed)'),
      
    })}
    onSubmit={values => alert(values)}
  >
    {formikProps => {
      const {
        values,
        touched,
        errors,
        isSubmitting,
        handleChange,
        handleBlur,
        handleSubmit,
        setFieldValue,
        setFieldTouched
      } = formikProps
      return (
        <form>
          <div>
            <label htmlFor="title">
              <span className="mandatory">Title</span>
              <input
                id="title"
                type="text"
                value={values.title}
                onChange={handleChange}
                onBlur={handleBlur}
                className={`${errors.title && touched.title ? error : ''}`}
              />      
              {errors.title && touched.title && <div>{errors.title}</div>}
            </label>
          </div>

          <div>
            <label htmlFor="firstName">
              <span className="mandatory">First Name</span>
              <input
                id="firstName"
                type="text"
                value={values.firstName}
                onChange={handleChange}
                onBlur={handleBlur}
                className={`${errors.firstName && touched.firstName ? error : ''}`}
              />      
              {errors.firstName && touched.firstName && <div>{errors.firstName}</div>}
            </label>
          </div>

          <div>
            <label htmlFor="surname">
              <span className="mandatory">Surname</span>
              <input
                id="surname"
                type="text"
                value={values.surname}
                onChange={handleChange}
                onBlur={handleBlur}
                className={`${errors.surname && touched.surname ? error : ''}`}
              />      
              {errors.surname && touched.surname && <div>{errors.surname}</div>}
            </label>
          </div>
        </form>
      )
    }}
  </Formik>
</div>


Validate on Edit

By default Formik will validate once the first blur has occurred and then every time on change.  This gives the user the chance to get the content right first without being bothered by error messages.  However, should you want to validate all changes from the off, you can change your inputs to include an onInput function,


            <label htmlFor="surname">

              <span className="mandatory">Surname</span>
              <input
                id="surname"
                type="text"
                value={values.surname}

                onInput={() => setFieldTouched('surname', true)} // Validate as the user types
                onChange={handleChange}
                onBlur={handleBlur}
                className={`${errors.surname && touched.surname ? error : ''}`}
              />      
              {errors.surname && touched.surname && <div>{errors.surname}</div>}
            </label>


Validate Function

Validation doesn't have to be done by a static schema.  A static schema is quite limiting, as it doesn't allow dynamic messages to be generated.  You can still use a schema, as the syntax is really easy, but just recreate it each time.  If you use the validate function, the Formik 'values' object is passed to it.  This can be used to generate a schema with the current values and adjust the error messages accordingly.  In the example below the validate function asks for a validation schema.  The schema uses the values and backtick template strings to create dynamic error messages.

getValidateSchema = values => 
  Yup.object().shape({
    title: Yup.string()
      .trim()
      .required('Please enter a title')
      .max(5, `Too many characters (${values.title.length} entered, Maximum 5 allowed)`),
    firstName: Yup.string()
      .trim()
      .required('Please enter a firstName')
      .max(100, `Too many characters (${values.firstName.length} entered, Maximum 100 allowed)`),
    surname: Yup.string()
      .trim()
      .required('Please enter a surname')
      .max(100, `Too many characters (${values.surname.length} entered, Maximum 100 allowed)`),
  })

validate = values => {
  try {
    validateYupSchema(values, this.getValidateSchema(values), true, {})
    return {}
  } catch (error) {
    return yupToFormErrors(error)
  }
}

render = () => 
  <div>
    <Formik
      initialValues={{
        title: '',
        firstName: '',
        surname: ''
      }}
      validate={values => this.validate(values)}
      onSubmit={values => alert(values)}
    >
      {formikProps => {
        const {
          values,
          touched,
          errors,
          isSubmitting,
          handleChange,
          handleBlur,
          handleSubmit,
          setFieldValue,
          setFieldTouched
        } = formikProps
        return (
          <form>
            <div>
              <label htmlFor="title">
                <span className="mandatory">Title</span>
                <input
                  id="title"
                  type="text"
                  value={values.title}
                  onInput={() => setFieldTouched('title', true)}
                  onChange={handleChange}
                  onBlur={handleBlur}
                  className={`${errors.title && touched.title ? error : ''}`}
                />      
                {errors.title && touched.title && <div>{errors.title}</div>}
              </label>
            </div>

            <div>
              <label htmlFor="firstName">
                <span className="mandatory">First Name</span>
                <input
                  id="firstName"
                  type="text"
                  value={values.firstName}
                  onInput={() => setFieldTouched('firstName', true)}
                  onChange={handleChange}
                  onBlur={handleBlur}
                  className={`${errors.firstName && touched.firstName ? error : ''}`}
                />      
                {errors.firstName && touched.firstName && <div>{errors.firstName}</div>}
              </label>
            </div>

            <div>
              <label htmlFor="surname">
                <span className="mandatory">Surname</span>
                <input
                  id="surname"
                  type="text"
                  value={values.surname}
                  onInput={() => setFieldTouched('surname', true)}
                  onChange={handleChange}
                  onBlur={handleBlur}
                  className={`${errors.surname && touched.surname ? error : ''}`}
                />      
                {errors.surname && touched.surname && <div>{errors.surname}</div>}
              </label>
            </div>
          </form>
        )
      }}
    </Formik>
  </div>

Wednesday, 26 September 2018

JSON Date / Time Formatting

The default behaviour when serialising a Java 8 date/time object (such as LocalDateTime) to JSON is to break it down into its constituent parts, eg,

"startDate" : {
    "year" : 2014,
    "month" : "MARCH",
    "dayOfMonth" : 1,
    "dayOfWeek" : "SATURDAY",
    "dayOfYear" : 60,
    "monthValue" : 3,
    "hour" : 2,
    "minute" : 2,
    "second" : 0,
    "nano" : 0,
    "chronology" : {
      "id" : "ISO",
      "calendarType" : "iso8601"
    }
  }

Spring Boot 2

Spring Boot already contains the dependencies,

    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.datatype</groupId>
        <artifactId>jackson-datatype-jsr310</artifactId>
    </dependency>

and it will wire the jsr310 module automatically into any ObjectMapper that is @Autowired.  All that is then necessary is to add the property,

    spring.jackson.serialization.WRITE_DATES_AS_TIMESTAMPS=false
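With that property set (and the jsr310 module wired in), the same field serialises as an ISO-8601 string instead, something like the fragment below (the exact precision of the time part can vary by Jackson version):

```json
{ "startDate" : "2014-03-01T02:02:00" }
```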

Standalone

For standalone code the dependencies above will need to be added.  Then, when the ObjectMapper is created, the following is necessary,

    final ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.enable(SerializationFeature.INDENT_OUTPUT);
    objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
    objectMapper.registerModule(new JavaTimeModule());


Deserializer

Of course, if you already have a file that was serialized like this, it is too late.  In that case you can transform the file using some custom deserializers.  Below are two deserializers, for LocalDate and LocalDateTime, to allow you to reconstruct your object.




  public class LocalDateDeserializer extends StdDeserializer<LocalDate>
  {
      public LocalDateDeserializer()
      {
          this(null);
      }

      public LocalDateDeserializer(final Class<LocalDate> t)
      {
          super(t);
      }

      @Override
      public LocalDate deserialize(final JsonParser p, final DeserializationContext ctxt) throws IOException, JsonProcessingException
      {
          JsonNode node = p.getCodec().readTree(p);
          return LocalDate.of(node.get("year").intValue(),
                              node.get("monthValue").intValue(),
                              node.get("dayOfMonth").intValue());
      }
  }

  public class LocalDateTimeDeserializer extends StdDeserializer<LocalDateTime>
  {
      public LocalDateTimeDeserializer()
      {
          this(null);
      }

      public LocalDateTimeDeserializer(final Class<LocalDateTime> t)
      {
          super(t);
      }

      @Override
      public LocalDateTime deserialize(final JsonParser p, final DeserializationContext ctxt) throws IOException, JsonProcessingException
      {
          JsonNode node = p.getCodec().readTree(p);
          return LocalDateTime.of(node.get("year").intValue(),
                                  node.get("monthValue").intValue(),
                                  node.get("dayOfMonth").intValue(),
                                  node.get("hour").intValue(),
                                  node.get("minute").intValue(),
                                  node.get("second").intValue(),
                                  node.get("nano").intValue()
                                  );
      }
  }

These allow you to write code to read in your existing badly formatted JSON like so,

  final ObjectMapper mapper = new ObjectMapper();

  SimpleModule module = new SimpleModule();
  module.addDeserializer(LocalDate.class, new LocalDateDeserializer());
  module.addDeserializer(LocalDateTime.class, new LocalDateTimeDeserializer());
  mapper.registerModule(module);
  final MyObject myobj = mapper.readValue(new FileInputStream("<file to read in>"), MyObject.class);

  final ObjectMapper dateTimeMapper = new ObjectMapper();
  dateTimeMapper.registerModule(new JavaTimeModule());
  dateTimeMapper.configure(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS, false);
  final FileOutputStream os = new FileOutputStream("<file to write out>");
  dateTimeMapper.writeValue(os, myobj);
  os.flush();
  os.close();