Tag Archives: Uniface 10

Support for Uniface in the cloud: a DevOps project

For the last few months we have been working towards adding cloud providers to the Product Availability Matrix (PAM). This project is known internally as Cloud Phase 1 and has proven to be, on the whole, a DevOps project.

DevOps

For us to add support for a platform, there are several things we must do – the most important of which is to test Uniface on that platform, for every build, to make sure it works. The framework we use to test Uniface is a custom-built application with the imaginative name of RT (Regression Test); it contains tests targeted at proving Uniface functionality, and these tests have been built up and extended as functionality is added, enhanced, or maintained.

Up until the cloud project, the process of building and testing Uniface (and this is a very simplistic description) was to:

  • Create a Uniface installation by collecting the compiled objects from various build output locations (we have both 3GL and Uniface Script)
  • Compile the RT application and tests using the newly created version
  • Run the test suite
  • Analyze the output for failures and, if successful:
    • Create the installable distribution (E-Dist or Patch)

Testing and building were performed on pre-configured (virtual) machines with databases and other 3rd party applications already installed.

Adding a new platform (or a new version of an existing platform) to our support matrix could mean manually creating a whole new machine, from scratch, to represent the new platform.

To extend support onto cloud platforms, we have some new dimensions to consider:

  • The test platform needs to be decoupled from the build machine, as we need to build in-house and test in the cloud
  • Tests need to run on the same platform (e.g. CentOS) but in different providers (Azure, AWS, …)
  • Support for constantly updating Relational Database Service (RDS) type databases needs to be added
  • The environment needs to be scalable with the ability to run multiple test runs in parallel
  • It has to be easily maintainable

As we are going to be supporting the platforms on various cloud providers, we decided to use DevOps methodologies and the tools most common for this type of work. The process, for each provider and platform, now looks like this:

  • Template machine images are created at regular intervals using Packer. Ansible is used to script the installation of the base packages that are always required
  • Test pipelines are controlled using Jenkins
  • Machine instances (based on the pre-created Packer image) and other cloud resources (such as networks and storage) are created and destroyed using Terraform
  • Ansible is used to install Uniface from the distribution media and, if needed, overlay the patch we are testing
  • The RT application is installed using rsync and Ansible
  • RT is then executed one test suite at a time with Ansible dynamically configuring the environment
  • Docker containers are used to make available any 3rd party software and services needed by individual tests, and they are only started if the current test needs them. Examples of containers we have made available to the test framework are a mail server, a proxy server, a web server, and an LDAP server
  • Assets such as log files are returned to the Jenkins server from the cloud-based virtual machine using rsync
  • The results from Jenkins and the cloud tests are combined with the results from our standard internal test run to give an overview of all the test results.

As quite large chunks of the processing are executed repeatedly (e.g. configuring and running a test), we have grouped the steps together and wrapped them with make.

As most of the platforms go through the same process, we have also been able to parameterize each step. This should mean that adding a new platform or database to test on, once the distribution becomes available, “could” be as simple as adding a new configuration file.

The result of Phase 1 of the Cloud project is that the Product Availability Matrix has been extended to include new platforms and databases. Internally we also have the benefit of having a much more scalable and extendable testing framework.

The new platforms added to the PAM in 9.7.04 and 10.2.02 by the cloud project:

Uniface Deployment-in-Cloud

In this initial phase, we have been concentrating on the Linux platforms; next (in Phase 2) we will be working on Windows and MS SQL Server.

During this process, I have learnt a lot about our test framework and the tests it runs. Much of the work we have undertaken has simply been a case of lifting what we already have and scripting its execution. This has not been the case for everything; there have been some challenges. An example of something that has been more complex than expected is testing LDAP. The existing environment used a single installation of LDAP for every platform being tested. Tests would connect to this server and use it to check the functionality. As the tests both read and write, we could only allow one test to be active at a time; other tests and platforms would have to wait until the LDAP server was released and became available before continuing. With the cloud framework, we have an isolated instance of the service for each test that needs it.

The project to bring cloud support to Uniface has been an interesting one. As well as allowing us to add new platforms and providers onto our supported matrix, it has also allowed us to be more scalable and flexible when testing Uniface.


Heading towards Uniface 10.3

Since the release of Uniface 10.2, the topic of custom utilities on the Uniface repository has come up several times during conversations with customers, at user events, and in the forums. The plan is that we address at least part of these requirements (making umeta.xml available) in 10.3.

Migrating to Uniface 10
Uniface Entity Editor

Why wait for 10.3? The move from 9 or 10.2 to 10.3 will require a full migration, an XML export and import. This is something we don’t do in patches or service packs. The reason for the migration is that we are working towards locking the repository to offer a stable platform for customers. By stable I mean one where, for the foreseeable future, we will not require customers to undertake a full migration. We have planned changes and are validating them to make sure we can implement the functionality we would like to deliver. For Work Area Support we need to make sure that, as much as possible, merging is possible. For global objects (USOURCE) we are splitting the table into multiple tables that more closely reflect the data they hold.

With the repository being updated, there are some areas in the development environment that also need attention; we need to ensure they continue to work:

  • The compiler
  • Export/Import
  • The hybrid components
  • XML
  • Migration
  • Create Table Utility (for repository and user tables)

Whilst on the subject of the “Create Table Utility”… We have been thinking about how it might fit into the IDE: should it have its own workbench, or should we achieve the functionality in another way? There are currently two implementations we are looking at. The first is from the command line. This is how, in the future, we will supply the scripts for the repository. Getting Uniface to generate the scripts, rather than supplying a static list with the installation, will mean more deployment options – it will use the driver options in the ASN file to generate the correct scripts for your environment.

Uniface Scripts
*Example only
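
As a rough sketch of the idea – the exact utility syntax was still being designed at the time, so treat this as illustrative only – the driver options that would steer script generation already live in the assignment (ASN) file, for example:

[DRIVER_SETTINGS]
; assumption for illustration: the generator reads the active
; driver settings (here the Oracle driver) to decide which DDL
; syntax and options to emit for your environment
USYS$ORA_PARAMS = <your driver options>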

Secondly, we are looking at adding a create-table menu option to the project editor. With this method it would be possible to collect all the tables you need to generate into a project and ask Uniface to generate the scripts for you.

Uniface Table Menu

Uniface 10: What’s happened since the release?

Back in September 2016 we had quite a major event: Uniface 10 was released with the ability to develop and maintain all forms of Uniface applications – client/server, web, and batch.

Uniface 10

Since the release, and based on lots of feedback from the early adopters, we have continued to actively enhance the IDE with constant incremental improvements. In this blog post, I would like to share with you what these improvements are as well as what we have planned for the near future.

To start, it is probably a good idea to outline the high-level topics we have been concentrating on.

Migration

This topic has probably been our primary focus during the continuous updating of v10. We have always had a migration path between Uniface versions, automating any updates needed. In version 10 we continue with this concept, and as information becomes available from customers and our own experience, the migration utilities are updated to further improve the experience.

Uniface 10: Code migrated from 9 to 10
Usability and bug fixing

Performance in large repositories has proven to be an area where we have needed to pay attention, and it has generated some lively discussions on uniface.info. Although this is an ongoing theme, we have already made significant enhancements. The dropdown browse dialogs for the Development Objects (cpt:, ent:, libinc:, etc.) now load the information and format the data with considerably less delay. Incremental rendering has also been added, so that the list becomes available and usable even while extra rows continue to be added. The same techniques and improvements will also be applied to the resource browsers in the coming patches.

Uniface 10: Cascading browse dialogs
Embedding the GUI screen painter

Client server development is another area we are enhancing. The first enhancement we are planning and currently working on is embedding the form painter directly into the v10 IDE.

Uniface 10: Embedded form painter taken from a developer’s PC
Runtime enhancements

It is now possible to specify which trigger, accept or quit, will be called when an auto-close popup loses focus.

We have added the ability to undeclare a trigger, operation, or local proc. This allows modeled or previously defined scripts to be excluded from the compile, effectively allowing the default functionality of a trigger to be re-established, as shown in the sketch below.
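
As a minimal sketch (undeclare is covered in depth in the post on restoring default behavior further down this page):

; exclude the inherited detail trigger from the compile,
; re-establishing its default behavior
undeclare trigger detail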

The ability to call up to a higher-level trigger has been added; this allows such actions as explicitly calling the entity-level detail trigger from the field-level detail trigger.

Uniface 10: New popup options

As you can see, we’ve been very busy, and there is a lot more to come.

When is the best time to plant a tree?

When is the best time to plant a tree? According to a Chinese proverb, it was 20 years ago. The second-best time is now.

As Uniface developers we know this is true. Most applications written in Uniface originate from 20 years ago. And they are still alive and kicking. Well, I am not sure about the kicking part, but they are certainly alive. But I want to build new applications today. I am sure we all want to.

In previous blog posts I told you about my worries. Some of you replied, or sent me an email. Thanks for that! You told me about these frameworks that existed in the mid-nineties. A good and sound framework is an essential building block 🙂 for fast application development. It’s the foundation of applications, but why should we reinvent the wheel over and over again? I would rather spend my energy on programming algorithms and coding business logic.

But let’s be honest, we need more. I mean more frameworks that can be used to build mobile applications, or at least fully responsive web applications with DSPs.

There are hundreds or thousands of excellent Uniface developers out there. And we need a working space where we can meet and join forces. What if such a place existed? Where we could create nice tools, examples, pieces of proc code or even a complete framework? Wouldn’t that be great! Let’s join forces, start a new community and have this working space. Interested? I have a plan…

Community

I want to organize a new Uniface developers community. The goal of this community is to build, maintain and share Uniface components. And I am searching for developers to participate. A few assumptions:

  • Let’s start small, with a few developers. Not more than a dozen.
  • It’s an online community, so it doesn’t matter where you live or work, or which time zone you’re in.
  • We will communicate in English. My English is not the best, but I try. I am sure we all can.
  • The community and the products are independent. So, the software we create or documents we write are owned by the community.
  • Participation is on a personal basis. So you don’t represent your employer. Not even when it’s Uniface. 🙂
  • Everything we create will be shared for free and will be open source. I will write a post about open source in the near future. Because it’s not what we are used to…
  • Last but not least: we will work independently and locally most of the time. But a community is about team spirit, especially in the beginning. We don’t have to be friends, but we need to respect each other’s opinion.

Rocket science

The community is going to build Uniface components. Of course, I am talking about Uniface 10. We will start with some nice examples. That’s also the best way to get used to the environment.

All of us know how to build nice applications with Uniface. I don’t know if you have any experience with version control, creating mobile apps or Uniface 10. But I am sure community-based development is new for us all. So, at certain points it’s going to be trial and error.

Before we can build anything, we need to set up an entire environment where we can work together. Think of the architecture inside your development department, but completely online.

There is already a complete online environment. All that is missing is you!

Want to participate? Please send me an email (lammersma@hotmail.com). Don’t worry about the technical stuff. It will be explained to you!


How to restore default behavior in Uniface 10

In Uniface 9, there are two types of trigger behavior:

  • Implicit or default Uniface behavior
  • Explicit behavior

The most well-known example of implicit or default behavior is the execute trigger, which displays the component in edit mode even if you do not declare it. Explicit behavior is all code that a developer writes. This explicit behavior will block the default behavior even if you did not intend it to. I cannot count the number of times I created an execute trigger and forgot to explicitly add the edit statement.
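
For example, as soon as you write any code in a Uniface 9 execute trigger, you must state the edit-mode behavior yourself:

; Uniface 9 execute trigger container: once you write any code
; here, the implicit behavior is gone, so you must add edit yourself
edit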

In Uniface 9, every trigger has its own code container and is validated by the compiler. In a way, all triggers are defined implicitly. The way to restore default behavior is to replace the (inherited) code in the trigger with a semicolon (sometimes followed by a comment). The semicolon breaks inheritance but does not implement any other explicit behavior. The compiler considers the trigger to be empty and disregards it during compilation. At runtime, Uniface reverts to its default behavior (if it has any).
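
In Uniface 9, the entire content of such a trigger container is nothing more than a line like this:

; revert to default behavior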

In Uniface 10, we reduced the number of containers. An entity has a maximum of three containers: one for the declarations, one for the collection operations, and one for the occurrence operations (and all other occurrence-level modules, such as triggers). Fields, components, and application shells have only two: a Declarations container and a Script container. The Script container is where you store your triggers, operations, and entries. There is no way to implicitly define a trigger anymore. Every trigger is either inherited or explicitly declared; in both cases, the trigger loses its default behavior.
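
For instance, a single Script container can now hold several modules side by side:

trigger detail
  ; explicit code
end ; trigger detail

operation myOperation
  ; explicit code
end ; myOperation

entry myEntry
  ; explicit code
end ; myEntry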

Restoring the default behavior by using a semicolon in the trigger no longer works. During compilation, Uniface 10 will create a trigger that does nothing when executed. It still breaks inheritance but no longer restores the default behavior.


; this trigger breaks inheritance 
; but does not restore the default behavior
; in Uniface 10
trigger detail   
; 
end ; trigger detail

The only real solution to revert to the default behavior is to convince the compiler to ignore all explicit behavior. For this purpose, we introduced the ‘undeclare’ keyword, which instructs the compiler to skip the compilation of the explicitly defined triggers, thereby restoring the default behavior:


undeclare trigger detail

Triggers can only be undeclared after they have been declared. When a trigger is undeclared before its declaration (or when it does not exist at all), the compiler will ignore the undeclare and issue a warning in the Messages tab.


; first declare the trigger 
trigger detail  
  askmess "detail trigger"
end
; then undeclare it
undeclare trigger detail

Undeclaring inherited triggers

A trigger that a component inherits from a modeled entity or field can be undeclared in the Script container of the modeled object or in the Script container of the corresponding derived object in the component.

Suppose the Script container of a modeled field contains the following code:


trigger detail
  askmess "detail trigger"
end

The Script container of that same field on a component contains:


undeclare trigger detail

This will result in the removal of the modeled detail trigger and will restore the default behavior (if it has any).

Undeclaring operations and entries

It is also possible to undeclare operations and entries:


operation myOperation
   ; some code
end ; myOperation
undeclare operation myOperation

entry myEntry
  ; some code
end ; myEntry
undeclare entry myEntry

For operations, the same rule applies as for triggers: they can be undeclared in the modeled or the derived object, but it must be done in the same code container. For instance, if an operation is declared in the Occurrence Script container at the model level, it can only be undeclared in the same container at either the model level or the component level, as sketched below.
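
A minimal sketch of that rule, with comments indicating which container each fragment lives in:

; Occurrence Script container of the modeled entity:
operation myOperation
  ; some code
end ; myOperation

; Occurrence Script container of the same entity on the component:
undeclare operation myOperation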

For entries it works differently. Since entries have component scope, an entry can be undeclared in every container that is compiled after the last container in which the entry was declared. This requires a deeper understanding of how the compiler works.
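
A sketch of the difference, assuming – purely for the sake of the example – that the field’s Script container is compiled after the component’s Script container:

; Script container of the component:
entry myEntry
  ; some code
end ; myEntry

; Script container of a field on that component, compiled later,
; so the compiler accepts the undeclare:
undeclare entry myEntry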