All posts by Mike Taylor

Support for Uniface in the cloud: a DevOps project

For the last few months we have been working towards adding cloud providers to the Product Availability Matrix (PAM). This project is known internally as Cloud Phase 1 and has proven to be, on the whole, a DevOps project.

DevOps

For us to add support for a platform there are several things we must do – the most important of which is to test Uniface on that platform, for every build, to make sure it works. The framework we use to test Uniface is a custom-built application with the imaginative name of RT (Regression Test); it contains tests targeted at proving Uniface functionality, and these tests have been built up and extended as new functionality is added, enhanced or maintained.

Up until the cloud project, the process of building and testing Uniface (and this is a very simplified description) was to:

  • Create a Uniface installation by collecting the compiled objects from various build output locations (we have both 3GL and Uniface Script)
  • Compile the RT application and tests using the newly created version
  • Run the test suite
  • Analyze the output for failures and, if successful:
    • Create the installable distribution (E-Dist or Patch)

The building and testing were completed on pre-configured (virtual) machines with databases and other third-party applications already installed.

Adding a new platform (or a new version of an existing platform) to our support matrix could mean manually creating a whole new machine, from scratch, to represent the new platform.

To extend support onto cloud platforms, we have some new dimensions to consider:

  • The test platform needs to be decoupled from the build machine, as we need to build in-house and test in the cloud
  • Tests need to run on the same platform (i.e. CentOS) but in different providers (Azure, AWS, …)
  • Support for constantly updating Relational Database Service (RDS) type databases needs to be added
  • The environment needs to be scalable with the ability to run multiple test runs in parallel
  • It has to be easily maintainable

As we are going to be supporting the platforms on various cloud providers, we decided to use DevOps methodologies and the tools most common for this type of work. The process, for each provider and platform, now looks like this:

  • Template machine images are created at regular intervals using Packer. Ansible is used to script the installation of the base packages that are always required
  • Test pipelines are controlled using Jenkins
  • Machine instances (based on the pre-created Packer image) and other cloud resources (like networks and storage) are created and destroyed using Terraform
  • Ansible is used to install Uniface from the distribution media and, if needed, overlay the patch we are testing
  • The RT application is installed using rsync and Ansible
  • RT is then executed one test suite at a time with Ansible dynamically configuring the environment
  • Docker containers are used to make available any third-party software and services we need for individual tests, and they are started only if the current test needs them. Examples of containers we have made available to the test framework are a mail server, a proxy server, a web server and an LDAP server
  • Assets such as log files are returned to the Jenkins server from the cloud-based virtual machine using rsync
  • The results from Jenkins and the cloud tests are combined along with the results from our standard internal test run to give an overview of all the test results.

As quite large chunks of the processing are executed repeatedly (e.g. configure and run a test), we have grouped the steps together and wrapped them with make.
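The real grouping lives in make targets, but as a rough, Python-flavoured sketch of the flow (the directory, playbook and host names below are invented for illustration, not our actual files), one such grouped chain for a single platform looks something like this:

```python
# Illustration only: the real wrapper is a set of make targets; all names
# (directories, playbooks, hosts) are invented for this sketch.
import subprocess

def run(*cmd):
    """Run one step and fail loudly, as a make recipe would."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def test_platform(platform_dir, test_host):
    # 1. Create the cloud resources (instance, network, storage) from the
    #    pre-built Packer image referenced by the Terraform configuration.
    run("terraform", "-chdir=" + platform_dir, "apply", "-auto-approve")
    try:
        # 2. Install Uniface, overlay the patch under test, then install RT.
        run("ansible-playbook", "-i", platform_dir + "/inventory", "install_uniface.yml")
        run("ansible-playbook", "-i", platform_dir + "/inventory", "install_rt.yml")
        # 3. Run the test suites one at a time; Ansible configures each run.
        run("ansible-playbook", "-i", platform_dir + "/inventory", "run_rt.yml")
        # 4. Pull logs and other assets back for analysis on the Jenkins side.
        run("rsync", "-az", test_host + ":/opt/rt/logs/", "results/")
    finally:
        # 5. Always destroy the cloud resources again, even after a failure.
        run("terraform", "-chdir=" + platform_dir, "destroy", "-auto-approve")

if __name__ == "__main__":
    test_platform("platforms/centos7-aws", "rt-centos7-aws")
```

Each numbered step corresponds to one of the bullet points above; Jenkins simply triggers the chain per provider and platform.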

As most of the platforms go through the same process, we have also been able to parameterize each step. This should mean that adding a new platform or database to test on, once the distribution becomes available, “could” be as simple as adding a new configuration file.
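To give a feel for what that could look like (the keys and values here are invented, not our actual configuration format), the platform-specific details reduce to a small set of parameters that the generic, parameterized steps consume:

```python
# Invented example of a per-platform configuration; the keys and values are
# illustrative only, not the real format used by our pipeline.
import json

config_text = """
{
  "platform":  "centos7",
  "provider":  "aws",
  "image":     "rt-centos7-latest",
  "databases": ["postgresql", "mysql"],
  "suites":    ["3gl", "script", "web"]
}
"""
cfg = json.loads(config_text)

# The same parameterized steps run for every platform; only the values change.
terraform_args = ["-var", "image=" + cfg["image"], "-var", "provider=" + cfg["provider"]]
playbooks = ["suite_" + name + ".yml" for name in cfg["suites"]]
print(terraform_args, playbooks)
```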

The result of Phase 1 of the Cloud project is that the Product Availability Matrix has been extended to include new platforms and databases. Internally we also have the benefit of having a much more scalable and extendable testing framework.

The new platforms added to the PAM in 9.7.04 and 10.2.02 by the cloud project:

Uniface Deployment-in-Cloud

In this initial phase, we have been concentrating on the Linux platforms; next (in Phase 2) we will be working on Windows and MS SQL Server.

During this process, I have learnt a lot about our test framework and the tests it runs. Much of the work we have undertaken has just been a case of lifting what we already have and scripting its execution. This has not been the case for everything, though; there have been some challenges. An example of something that has been more complex than expected is testing LDAP. The existing environment used a single installation of LDAP for every platform being tested. Tests would connect to this server and use it to check the functionality. As the tests both read and write, we could only allow one test to be active at a time; other tests and platforms would have to wait until the LDAP server was released and became available before continuing. With the cloud framework, we have an isolated instance of the service for each test that needs it.
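As a sketch of that idea (the container image, port handling and helper names are illustrative assumptions, not our actual test framework code), giving each test its own LDAP server can be as simple as wrapping the test in a start/stop of a dedicated container:

```python
# Illustration only: start an isolated LDAP server for a single test and
# throw it away afterwards. The image name is just an example.
import subprocess
import uuid

def start_ldap():
    name = "rt-ldap-" + uuid.uuid4().hex[:8]
    subprocess.run(
        ["docker", "run", "-d", "--rm", "--name", name,
         "-p", "127.0.0.1::389",           # let Docker pick a free host port
         "osixia/openldap:1.5.0"],
        check=True)
    # Ask Docker which host port was mapped to port 389 inside the container.
    mapping = subprocess.run(
        ["docker", "port", name, "389/tcp"],
        check=True, capture_output=True, text=True).stdout.splitlines()[0]
    host, port = mapping.rsplit(":", 1)
    return name, int(port)                 # the test connects to ldap://host:port

def stop_ldap(name):
    subprocess.run(["docker", "stop", name], check=True)

if __name__ == "__main__":
    container, port = start_ldap()
    print("LDAP test server listening on port", port)
    stop_ldap(container)
```

Because every test gets its own instance, tests on different platforms no longer have to queue for a shared server.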

The project to bring cloud support to Uniface has been an interesting one. As well as allowing us to add new platforms and providers onto our supported matrix, it has also allowed us to be more scalable and flexible when testing Uniface.

 

Heading towards Uniface 10.3

Since the release of Uniface 10.2, the topic of custom utilities on the Uniface repository has come up several times in conversations with customers, at user events and in the forums. The plan is that we address at least part of these requirements (making umeta.xml available) in 10.3.

Migrating to Uniface 10
Uniface Entity Editor

Why wait for 10.3? The migration from 9 or 10.2 to 10.3 will require a full migration, an XML export and import. This is something we don’t do in patches or service packs. The reason for the migration is that we are working towards locking the repository to offer a stable platform for customers. By stable I mean one where, for the foreseeable future, we will not require customers to undertake a full migration. We have planned changes and are validating them to make sure we can implement the functionality we would like to deliver. For Work Area Support we need to make sure that, as much as possible, merging is possible. For global objects (USOURCE) we are splitting the table into multiple tables to more closely reflect the data they hold.

With the repository being updated, there are some areas in the development environment that also need some attention; we need to ensure they continue to work:

  • The compiler
  • Export/Import
  • The hybrid components
  • XML
  • Migration
  • Create Table Utility (for repository and user tables)

Whilst on the subject of the “Create Table Utility”… we have been thinking about how it might fit into the IDE: should it have its own workbench, or should we achieve the functionality in another way? There are currently two implementations we are looking at. The first is from the command line. This option is how, in the future, we will be supplying the scripts for the repository. Getting Uniface to generate the scripts, rather than supplying a static list with the installation, will mean more deployment options – it will use the driver options in the ASN to generate the correct scripts for your environment.

Uniface Scripts
*Example only

Secondly, we are looking at adding a create table menu option to the project editor. With this method it would be possible to collect all the tables you need to generate into a project and ask Uniface to generate the scripts for you.

Uniface Table Menu

Uniface 10: What’s happened since the release?

Back in September 2016 we had quite a major event: Uniface 10 was released with the ability to develop and maintain all forms of Uniface applications – client/server, web and batch.

Uniface 10

Since the release, and based on lots of feedback from the early adopters, we have continued to actively enhance the IDE with constant incremental improvements. In this blog post, I would like to share with you what these improvements are as well as what we have planned for the near future.

To start, it is probably a good idea to outline the high-level topics we have been concentrating on.

Migration

This topic has probably been our primary focus during the continuous updating of v10. We have always had a migration path between Uniface versions, automating any updates needed. In version 10 we continue with this concept, and as information becomes available from customers and our own experience, the migration utilities have been updated to further improve the experience.

Uniface 10: Code migrated from 9 to 10
Usability and bug fixing

Performance in large repositories has proven to be an area where we have needed to pay attention, and it has generated some lively discussions on uniface.info. Although this is an ongoing theme, we have already made significant enhancements. The dropdown browse dialogs for the Development Objects (cpt:, ent:, libinc:, etc.) now load and format the data with considerably less of a delay. Incremental rendering has also been added, so that the list becomes available and usable even while extra rows continue to be added. The same techniques and improvements will also be applied to the resource browsers in the coming patches.

Uniface 10: Cascading browse dialogs
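Conceptually (this is a generic illustration of the technique, not the IDE’s actual implementation), incremental rendering just means fetching and appending rows in small batches, handing control back to the UI after each batch so the first rows are usable immediately:

```python
# Generic illustration of incremental rendering: rows are fetched and shown
# in small batches, so the list is usable before loading has finished.
from typing import Iterable, Iterator, List

def batched(rows: Iterable[str], size: int) -> Iterator[List[str]]:
    batch: List[str] = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

visible: List[str] = []
all_rows = ("cpt:COMPONENT_" + str(n) for n in range(10_000))  # e.g. component names
for chunk in batched(all_rows, 200):
    visible.extend(chunk)   # the list widget is updated after each batch
    # ...control returns to the UI here, so the user can already scroll and select
```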
Embedding the GUI screen painter

Client/server development is another area we are enhancing. The first enhancement we are planning and currently working on is embedding the form painter directly into the v10 IDE.

Uniface 10: Embedded form painter taken from a developer’s PC
Runtime enhancements

It is now possible to specify which trigger, accept or quit, will be called when an auto-close popup loses focus.

The ability to undeclare a trigger, operation or local proc has been added. This allows modelled or previously defined scripts to be excluded from the compile, effectively allowing the default functionality of a trigger to be re-established.

The ability to call up to a higher-level trigger has also been added; this allows such actions as explicitly calling the entity-level Detail trigger from the field-level Detail trigger.

Uniface 10: New popup options

As you can see, we’ve been very busy, and there is a lot more to come.

Uniface mobile – Custom Cordova plugin support

Some of you will have noticed that this week saw the release of two new Uniface patches – G302 for 9.7.02 and F102 for 10.2.01. Normally I wouldn’t post about a patch; however, this time there is something new included that I would like to share with you: it is now possible to include custom plugins in your mobile app.

In 9.7.02 we introduced the ability to access the Buildozer online build services to compile iOS and Android apps. Included in this integration was the facility to select, from a predefined list, a number of plugins to be included and made available, in JavaScript, to your DSP application. The latest patches have extended this support to enable you to also include plugins from third parties or ones you have created yourself.

Looking in the development environment with the application shell type set to Mobile, you will see it is possible to set the mobile app’s properties. With these patches, a new field has been added to the properties screen that allows you to specify one or more public (git) repositories where the plugins reside. Now, with the plugins selected in the application startup shell, you can submit your app to be built and, as part of the server processing, the plugins will be downloaded automatically and included within your project.

Mobile app definitions screen

As an example, we have been asked for the ability to interface with Bluetooth to print a label on a wirelessly connected printer. A quick search on Google (for Cordova Bluetooth plugin) offered several options that seem to fit the bill, one of which, picked at random, is https://github.com/evothings/cordova-ble. If I were to include their repository (https://github.com/evothings/cordova-ble.git) in my Uniface definitions before building my app, I would be given access to the device’s Bluetooth capabilities using the plugin’s documented API.

The Uniface G302 and F102 patches also include the latest documentation which has been updated to include this topic.

Inheritance: Why Uniface 10 will save developers a lot of time

As many of you may be aware, we have – for some time – been working on the new version of Uniface, v10. As befits a major version increment, there are quite a few changes in the development environment, as well as to some of the concepts of Uniface. Today I would like to describe one such change, one that should help to make development behavior more predictable.

Inheritance, from model to component, has always been a cornerstone of developing an application with Uniface. The model contains the global definition and the component the external variation, with the external variation taking precedence over the model at compile time. Let’s examine this concept in a little more detail in the area of local procs.

Imagine, if you will, that when using version 9 you have entries defined in the Local Procs Module trigger of a modelled field, and that the field is painted on a component. When you compile and run the component, any calls you have made to the local proc will execute the entry defined in the model. Now, if you were to create an external variation of the proc and run the component again, you would expect the external variation to be used. So given this, the external variation wins out over modelled procs, right? Well, no, not quite! If I were now to paint a new modelled field, with yet another version of the local proc defined, lower down the compile order of the component, what happens could be seen as unexpected – the modelled proc of the second field would be used. If the order of compile changes, i.e. the second entity is moved on the component paint tableau, the module that is used could change. This was not the intended behavior.

In version 10 the process has been made much easier to understand – the external variation always takes precedence. All model definitions are compiled before the external variations are overlaid. This change means that compilation becomes far more predictable, with less “magic” and fewer mistakes.

There has been another improvement in the compilation of local procs – they are now overlaid. Prior to version 10 you could have, in the model, many entries defined in a single trigger, and if you wished to make a change to just one of them in the component, you would have to break inheritance for them all. Effectively, you would duplicate your code into each component where this was the case. Although it is possible to simulate the inheritance using included procs, it can take quite a bit of planning to implement. With version 10 you are now able to overlay just a single proc module, leaving the inheritance in place for all the others.
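One way to picture the version 10 rule (this is just the precedence rule expressed as a simple dictionary overlay, not Uniface internals, and the module names are invented): all modelled modules are compiled first, and a component variation replaces only the module it names:

```python
# Not Uniface internals: just the v10 precedence rule as a dictionary overlay.
model_modules = {
    "LP_GETNAME": "modelled implementation",
    "LP_FORMAT":  "modelled implementation",
}
component_modules = {
    "LP_FORMAT": "component variation",   # only this module is overridden
}

compiled = {**model_modules, **component_modules}
assert compiled["LP_GETNAME"] == "modelled implementation"   # inheritance kept
assert compiled["LP_FORMAT"] == "component variation"        # override wins
```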

So how does this look in the component editor of version 10? The first thing you will see is that only the external variation code is displayed, not the modelled code. To enable easy navigation to all the code compiled into the component, a Compiled Module Information (CMI) panel has been added to the editor. It shows all modules compiled into the component. Double-clicking on a module in the CMI triggers a navigation action and you will be taken to the definition of that code module – opening a new editor (entity, included proc, etc.) if required. This is a big time saver for the developer.