The new COBOL program – CloudFormation

The new COBOL program is the CloudFormation template; it does not matter whether it is written in YAML or JSON.

A few years ago, the main driver for bringing JAVA/.NET frameworks into the shop was to decouple the giant multi-thousand-line COBOL programs. Those programs had a library of keywords that could be used, and their main logic was driven by specific Boolean flags; at least the COBOL programs that I migrated were of that kind.

Fast forward to my new world of writing CloudFormation templates: these are the next set of monoliths running a few thousand lines. The few pieces of keyword-like logic that can be written are called intrinsic functions.

There are different sections in a COBOL program: you have the program itself, divisions, sections and so on. The same parallels apply to a CloudFormation template, which has sections for declaring parameters, conditions, resources and outputs.

The next item in a CloudFormation template is the Conditions section. This is where the parameters passed to the template are evaluated so that the right resources can be provisioned. If you have seen the flag logic in a COBOL program, this section is the same idea.
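
As a minimal sketch (the parameter and condition names here are made up for illustration, not taken from any real template), a parameter and a condition line up like this, with the intrinsic function Fn::Equals playing the part of the Boolean flag check:

Parameters:
  EnvType:
    Type: String
    AllowedValues: [dev, prod]
    Default: dev

Conditions:
  IsProd: !Equals [!Ref EnvType, prod]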

The Resources section of the CloudFormation template uses the conditions that you defined earlier to provision the resources that are needed.

The Outputs section of the CloudFormation template stitches all the sections above together so that there is a user-friendly output.
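
Continuing the same made-up sketch, the Resources and Outputs sections pick up that condition:

Resources:
  ProdOnlyBucket:
    Type: AWS::S3::Bucket
    Condition: IsProd

Outputs:
  ProdOnlyBucketName:
    Description: Name of the bucket provisioned when EnvType is prod
    Value: !Ref ProdOnlyBucket
    Condition: IsProd

If IsProd evaluates to false, neither the resource nor the output is created, which is exactly the flag-driven branching the COBOL comparison points at.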

All these sections are closely tied together; if one fails or errors out, the entire stack is a dud.

Thinking about writing a CloudFormation template, as an engineer it is a challenging feeling to support this development practice. Is this the right thing to do for the modern world? Cheers.

JAVA 16 and some Spring Boot lessons

Some interesting observations from upgrading Spring Boot to JAVA 16. I use IntelliJ for my development.

  • Make sure the pom file is updated with the latest JDK version:

<properties>
<java.version>16</java.version>
</properties>

  • Upgrade Boot to a version that is compatible with JAVA 16 (2.4.4 in my case). Class file major version 60 corresponds to JAVA 16, and the ASM version bundled with older Boot releases cannot parse it, so upgrading Boot prevents the following error:

Caused by: org.springframework.core.NestedIOException: ASM ClassReader failed to parse class file - probably due to a new Java class file version that isn't supported yet: file [… HiringServiceImplTest.class]; nested exception is java.lang.IllegalArgumentException: Unsupported class file major version 60
Caused by: java.lang.IllegalArgumentException: Unsupported class file major version 60

  • Update the parent with the new Boot version, 2.4.4:

<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>2.4.4</version>
    <relativePath/> <!-- lookup parent from repository -->
</parent>

The branch for these changes can be found here:

https://github.com/sathishjayapal/hireme.git

Azure Functions photo upload function using JAVA

We will go over a working example that uploads a photo to an Azure storage container as a blob. The final codebase uploads a photo based on a curl command:

curl -w "\n" http://localhost:7071/api/uploadSathishPhoto --data /Users/sathishjayapal/Downloads/DSCN4934.JPG

High-level architecture

Prerequisites:

  • Azure CLI
  • Azure subscription
  • NPM
  • JAVA 8
  • Maven CLI

Now let us see how to get this working with Azure Functions. The reason I chose Azure Functions is the free tier you get for a quick app: the first million calls are free, and there is 5 GB of free space in Azure Storage. The API is front-ended with Azure API services; a few configurations can be done there, but we will keep that for another post.
The first block we are going to look at is the function. To get this going, we will use the Maven archetype for developing Azure Functions:

mvn archetype:generate -DarchetypeGroupId=com.microsoft.azure -DarchetypeArtifactId=azure-functions-archetype

This creates a bare Azure function that can receive an HTTP POST call to a URL. Two files come out of the archetype: host.json and local.settings.json.
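
For reference, the generated host.json is little more than a version marker, and local.settings.json holds the local storage connection and worker runtime. The values below are typical for a Java function and are shown only as an illustration; your AzureWebJobsStorage value will be your own connection string (or the local storage emulator, as shown here):

host.json:

{
  "version": "2.0"
}

local.settings.json:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "java"
  }
}

Now the other important item to look at is the pom.xml file: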

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <azure.functions.maven.plugin.version>1.3.4</azure.functions.maven.plugin.version>
    <azure.functions.java.library.version>1.3.0</azure.functions.java.library.version>
    <functionAppName>uploadSathishPhoto</functionAppName>
    <functionAppRegion>eastus</functionAppRegion>
    <stagingDirectory>${project.build.directory}/azure-functions/${functionAppName}</stagingDirectory>
    <functionResourceGroup>photofunctionrg</functionResourceGroup>
  </properties>

The function name is uploadSathishPhoto:
<functionAppName>uploadSathishPhoto</functionAppName>

We have to make sure the function name here matches the name used in the Java function API. So here is the JAVA function:

public class UploadSathishPhotoFunction {

  public static final String PHOTOPATH = "photopath";

  @FunctionName("uploadSathishPhoto")
  public HttpResponseMessage run(
      @HttpTrigger(name = "req", methods = {HttpMethod.GET, HttpMethod.POST},
          authLevel = AuthorizationLevel.FUNCTION) HttpRequestMessage<Optional<String>> request,
      final ExecutionContext context) {
    context.getLogger().info("Java HTTP trigger processed a request.");

    String photometer = request.getQueryParameters().get(PHOTOPATH);
    String photograph = request.getBody().orElse(photometer);
    if (photograph == null) {
      return request.createResponseBuilder(HttpStatus.BAD_REQUEST)
          .body("Please pass a photo path on the query string or in the request body").build();
    } else {
      BlobServiceClient blobClient;
      BlobContainerClient container1;
      try {
        blobClient = BlobClientProvider.getBlobClientReference();
        context.getLogger().info("\nCreate container ");
        container1 = createContainer(blobClient);
        context.getLogger().info("\n\tUpload a file as a block blob.");
        BlobClientProvider.uploadFileBlocksAsBlockBlob(container1, photograph);
        context.getLogger().info("\t\tSuccessfully uploaded the blob.");
      } catch (Throwable e) {
        context.getLogger()
            .log(Level.SEVERE, "Java HTTP trigger processed a request.", e.fillInStackTrace());
      }
      return request.createResponseBuilder(HttpStatus.OK).body("Hello, " + photograph).build();
    }
  }
}

Now the function relies on the Azure storage model: a blob service client is needed to start the upload, a container holds the blob, and finally the blob item itself is stored. In our application, we have two classes that take care of initializing the blob client and the container. We are going to upload all the pictures in this function to a specific container. The source code for these classes is checked into the repository.
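
Those helper classes are not reproduced in this post, so here is a minimal sketch of what a BlobClientProvider could look like. It assumes the com.azure.storage.blob SDK, a connection string supplied through the AzureWebJobsStorage setting, and a container named photos; all of these are assumptions on my part, and the real classes in the repository may differ:

import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;

import java.nio.file.Paths;

// Hypothetical sketch of the helper used by UploadSathishPhotoFunction.
public class BlobClientProvider {

  // Builds a BlobServiceClient from a connection string kept in an
  // app setting / environment variable (assumed name: AzureWebJobsStorage).
  public static BlobServiceClient getBlobClientReference() {
    String connectionString = System.getenv("AzureWebJobsStorage");
    return new BlobServiceClientBuilder()
        .connectionString(connectionString)
        .buildClient();
  }

  // Creates (or reuses) the container that holds the uploaded photos
  // (container name "photos" is assumed here).
  public static BlobContainerClient createContainer(BlobServiceClient serviceClient) {
    BlobContainerClient container = serviceClient.getBlobContainerClient("photos");
    if (!container.exists()) {
      container.create();
    }
    return container;
  }

  // Uploads the local file as a block blob, using the file name as the blob name.
  public static void uploadFileBlocksAsBlockBlob(BlobContainerClient container, String filePath) {
    String blobName = Paths.get(filePath).getFileName().toString();
    container.getBlobClient(blobName).uploadFromFile(filePath, true);
  }
}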

To run the function, run the Maven commands:

mvn clean install
mvn azure-functions:run 

When doing the install, keep an eye on the console output to make sure an Azure function is being built. This gives us an indication that the function is being packaged.

[INFO] Step 1 of 7: Searching for Azure Functions entry points
[INFO] 1 Azure Functions entry point(s) found.

To run the application locally, the Maven command is:

 mvn azure-functions:run

Keep an eye on the console to make sure the function has started:


uploadSathishPhoto: [GET,POST] http://localhost:7071/api/uploadSathishPhoto

Hosting environment: Production
Content root path: /Users/sathishjayapal/IdeaProjects/blob-azfunction-start/target/azure-functions/uploadSathishPhoto
Now listening on: http://0.0.0.0:7071
Application started. Press Ctrl+C to shut down.
[01/13/2020 02:47:54] Host lock lease acquired by instance ID '0000000000000000000000004DD021D9'.

To check that everything works, the curl command is:

curl -w "\n" http://localhost:7071/api/uploadSathishPhoto --data /Users/sathishjayapal/Downloads/DSCN4934.JPG

The JPG gets uploaded based on the parameter passed to curl. To confirm, let us look at the Azure portal.
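
If you would rather verify from the command line than the portal, the Azure CLI from the prerequisites can list the uploaded blobs. The account and container names below are placeholders for whatever your storage account and BlobClientProvider container actually are:

az storage blob list --account-name <your-storage-account> --container-name <your-container> --output table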

The complete code for this can be found here. Azure also has great getting-started guides; check the following link:

https://github.com/Azure-Samples

This entire post is based on JAVA 8. I got some specific errors when running on versions other than JAVA 8.