
Uniface Security

Blog by Jason Huggins


The latest releases of Uniface 9 and 10 mark a significant milestone in the enhancement of security, both under the covers and through new functionality to secure applications.

I believe that in practice all organizations need to protect business confidentiality and competitive edge, adhere to applicable privacy regulations, and prevent data theft and manipulation. Protecting data is paramount for practically everyone. It can feel like the wild west at times, with attacks coming from all directions: an employee, a contractor or visitor, a cyber-criminal, malware or ransomware, accidental hackers, curious observers… the list goes on! Whether the data breach is internal, external, malicious or accidental, the risk should be understood, assessed and addressed.

Statistics on the count, size and cost to victims show that global data breaches have increased year on year. The current average cost of a breach is in the millions of dollars, with total costs per year globally in the billions. Breach sizes have ranged from tens of millions of confidential records up to many billions of lines of data.

Predictions suggest that a clear majority of enterprise traffic will be encrypted throughout 2019. It is important for Uniface to support this, whilst making it easy to utilize as part of the development and deployment platform.

What is the threat?

There are many threats to data security, and network security exposes a key flaw. There is an inherent weakness in the standard TCP/IP network model and IPv4 because none of the layers include security features as such. The protocols ensure very reliable transmission of data; however, they do not fully ensure data integrity or confidentiality. It is extremely easy to sniff and tamper with the data in real time.

But wait, what about IPv6, you may ask? Well, IPsec, a security protocol, is built into the IPv6 standard; however, it is not mandatory. The official launch of IPv6 was in 2012, yet IPv4 is still the dominant protocol, covering around 75% of deployments. The rate of adoption appears to be slowing, but this does not in any way mean that IPv6 will not become the dominant standard; it may just take a little longer than expected. IPsec within IPv6 will not necessarily become the drop-in solution to the security hole, so it is still valid to apply alternative or additional mechanisms to secure the transmitted data. The Uniface implementation means that the application can, with ease, reliably ensure encryption is applied whatever the underlying IP network infrastructure and protocol support may be.

What’s new in network security?

Uniface now has a cryptography layer added to its network stack. The implementation is a TLS layer built on top of the standard TCP driver. The TCP driver itself has been refactored, yielding several improvements. The new TLS driver utilises OpenSSL libraries. OpenSSL, often referred to as the 'Swiss Army Knife' of cryptography, is well maintained and supported, has excellent platform coverage and is backed by major organizations. It implements both pre-shared key (PSK) and asymmetric certificate/key pair verification, the latter providing greater levels of security. The cryptography methods supported, called ciphers, are those of OpenSSL; however, by default Uniface will only list the stronger ciphers.

The new driver encrypts the network traffic, including IPv6, between Uniface processes encompassing both shared and exclusive servers. A key feature supported by the TLS driver is ‘Peer Name Verification’, which helps mitigate compromises such as ‘Man in the Middle’ attacks.
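Peer name verification is the same safeguard that general-purpose TLS stacks apply by default. As a non-Uniface illustration of the concept (this is Python's ssl module, not the Uniface TLS driver), a default client context both verifies the server's certificate chain and checks that the certificate matches the host we intended to reach:

```python
import ssl

# A default client context verifies the server's certificate chain AND
# checks that the certificate matches the intended host name. Without
# that peer name check, a "Man in the Middle" holding any valid
# certificate could impersonate the server.
ctx = ssl.create_default_context()

print(ctx.check_hostname)                    # peer (host) name verification is on
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # certificate verification is required
```

Disabling either check is what makes a TLS connection vulnerable to interception, which is why the Uniface driver surfaces it as a named feature.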

Configuration is very straightforward, matching the typical driver approach, with familiar mnemonics such as 'tls:' and 'USYS$TLS_PARAMS'. The configuration and the various possibilities are well documented in the help.
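As an illustrative sketch only (the actual parameter names, values and file layout are in the Uniface help; everything below apart from the 'tls:' mnemonic and the 'USYS$TLS_PARAMS' setting name is a placeholder, not verbatim documentation), an assignment-file fragment might take a shape like:

```
[SETTINGS]
; Placeholder -- see the help for the actual TLS parameters
USYS$TLS_PARAMS = <certificate, private key and cipher settings>

[PATHS]
; Route the connection through the TLS driver instead of plain TCP
$MYPATH = tls:examplehost+13001|user|password
```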


Security is a shared responsibility spanning development and operations. Being more of a configuration exercise, developers will see little change. The extra processing needed to encrypt/decrypt may have an influence; for example, transaction size and client vs server processing could become a consideration. Note: Uniface benchmarks match the published OpenSSL results.

Operations should understand security, TLS and encryption, ensuring they pick ciphers that adhere to internal policies whilst maximising performance. The 'pathscrambler' is essential and must be used to safeguard the TLS configuration settings.

The TLS driver is simple to use and should be considered an essential priority for most.


Global Objects

There are many types of Global Objects, like Messages, Global Procs and Keyboard Translation Tables, to name a few. The Uniface 9 IDF and the Uniface 10 IDE provide editors to maintain the definitions of those objects. You could consider those as the definition-time source objects. Successful compilation of those source objects results in compiled objects, their run-time counterparts.

The compiled objects are static objects. User applications can use them, but they have no way of changing them. To change them, you must use the editors in the Uniface development environment (the version 9 IDF or the version 10 IDE) to change their source objects, then compile those and make the resulting compiled objects available to the application.

Counter – the oddball

There is one particular type of Global Object that is different from the others: the Counter. Contrary to other Global Objects, Counters are not static run-time objects. Any application can change them through ProcScript. The numset statement creates a counter or re-initializes its value, and the numgen statement increases a counter’s value. Considering this, you may even consider Counters as run-time data rather than run-time objects.

To maintain Counters, Uniface 9 offers the Define Counter dialog. This dialog gives the impression that, like for other Global Objects, it maintains source objects. However, it does not. In fact, there are no source objects for Counters. They only exist as run-time objects, in UOBJ.TEXT. The Define Counter dialog acts directly on those run-time objects.

If your application uses Counters, be aware of these aspects, and apply extra care when deploying UOBJ.TEXT. Also, users of the Define Counter dialog might just accidentally change the value of a Counter that is being used by an application.

Migrating Counters to Uniface 10

Uniface 10 is straightforward: it simply regards Counters as user data. The one and only way to change them is through the ProcScript instructions that are intended for just that purpose: numset and numgen. There is no dialog that can be used to adjust Counter values.

If you already initialize and maintain your application’s Counters purely by means of ProcScript logic, there is no extra effort involved for the migration of Counters to version 10. This logic will continue to work as it did in version 9.

If, on the other hand, you use the IDF’s Define Counter dialog to initialize and maintain your application’s Counters, you will need to act. To achieve the same functionality in version 10, you must implement that logic yourself, using the available ProcScript instructions. Also, you will need to apply the same care as you did in version 9, to make sure that UOBJ.TEXT is properly distributed and/or installed.

This example auto-increments a counter. If it does not exist yet, it creates it and gives it an initial value:

  ; Auto-increment counter
  numgen "MYCOUNTER", 1, "MYLIB"
  if ($status == <UPROCERR_COUNTER>)
      ; Counter does not exist.
      ; Create it and give it an initial value of 1001.
      numset "MYCOUNTER", 1001, "MYLIB"
      if ($status < 0)
          message "Error generating sequence number."
          rollback "$UUU"
      else
          SEQNO = 1001
          commit "$UUU"
      endif
  else
      SEQNO = $result
      commit "$UUU"
  endif



Blog: Frank Doodeman
Frank is a Software Architect at Uniface.

Uniface 10.3: the version to go for

In case you’ve missed the summer’s exciting news from Uniface headquarters, Uniface 10.3 has now arrived.

I’ve already been working with this new version for a while, initially using a couple of pre-releases, but then for the past few weeks the live release. This experience has convinced me that Uniface 10, and version 10.3 in particular, is the version the Uniface community has been waiting for. I’m writing this blog post to explain why, and especially to share my experiences with the new IDE.

Background: Uniface 10

Uniface 10 was designed and built based on the wishes of the Uniface developer community. Hundreds of questions and requests from Uniface developers all over the world were taken into account during this extensive design exercise.

The result is a complete overhaul of the Uniface development environment. Uniface 10 has a whole new look and feel, comparable with any modern IDE.

Although it’s still recognizably Uniface, developers may need a little time to get used to the new version, but in my experience that will be time well spent. There’s no way I’m going back to Uniface 9!

A major difference from earlier versions is that Uniface 10 is a non-modal development environment, which means you can work with as many Uniface objects as you like in parallel. Being able to switch between components with just one click makes development easier and more efficient. This by itself is a great reason to start using Uniface 10.

Highlights of Uniface 10

Here are some of the enhancements that you’ll notice immediately when you start using Uniface 10 for the first time:

  • The IDE’s performance has significantly improved, making the non-modal concept a pleasure to work with.
  • The graphical form painter functionality is drastically improved – a strong argument for client/server developers to switch to Uniface 10.
  • Debugging is faster: every error and warning message in the compiler output contains a hyperlink to the relevant line of code.
  • There's a completely updated and stable meta-dictionary, so developers can safely port their existing custom-written utilities to Uniface 10. As in previous Uniface versions, additional menu items can be used to launch these utilities.
  • Uniface 10 now also has user-defined menus and user-defined worksheets. My experience shows these are very powerful. Yes, you might need to modify your tools, but again it’s worthwhile.
  • The new Transport Layer Security (TLS) network connector makes the network connection between client and server secure – vital for business-critical applications.

I’ll discuss many of these enhancements in more detail in future posts.

As well as all these major improvements, Uniface 10 brings some smaller “nice to haves”. For example, I’m pleased to have the option to set the title bar text of the IDE application window.

Migrating to Uniface 10

The migration process from Uniface 9.7 to Uniface 10 has been run by the Uniface team over and over again. Many huge Uniface 9 customer applications have been migrated successfully to Uniface 10. So for those currently on Uniface 9.6 or 9.7, migration is likely to be a smooth process.

If, on the other hand, you are currently considering migrating to Uniface 9.7.05, my advice would be to move directly to Uniface 10 instead, because of the advantages described above; it also means one migration rather than two and ensures long-term support. (This is also Uniface's advice.)

Conclusion: based on my experience, I believe Uniface 10.3 is the version to go for.

Blog: Peter Lammersma
Peter Lammersma is an entrepreneur and IT and business consultant. Peter works extensively with Uniface 10. As a long-serving member of the Uniface community, he’s kindly agreed to give his independent perspective in this blog series.

Do I need to compile this?

Over the years many Uniface developers have created tools on top of the Uniface Repository.

One tool that many have built is one that looks for "dirty" objects: objects that were modified after they were last compiled successfully.

In Uniface 9 such a tool would have been based on comparing the fields UTIMESTAMP (modification date of the source) and UCOMPSTAMP (compilation date of the source) of various Uniface Repository tables.

In Uniface 10 this has changed, mainly to align the repository with the export format that has been optimized for version management:

  • The modification date of a development object source is only found in the main object. And yes, it is also updated when you change a sub-object. So if you change a modeled field, the UTIMESTAMP of the entity is updated.
  • The compilation date of a development object is no longer an attribute of the source code. It does not have much value to know when the source was last compiled, if you can’t be sure that the compiled object was the result of that compilation. Someone may have copied a different compiled object to the output folder. The only real compilation date is that of the compiled object (file on disk or in a UAR).

Uniface 10.3 is the first release of Uniface 10 that is shipped with meta definitions: the new DICT model is published. So now you can re-engineer the tools that you made for Uniface 9.

In order to make it easier to (re)write a tool, the $ude(“exist”) function has been enhanced to return the attributes of the compiled object (file on disk or in a UAR) such as the modification date.

Compiling objects because their source code has changed

It is not just components that require compilation. There are 14 types of development object that require compilation and generate a file in your resources.  I have attached a sample tool that checks whether these objects are “dirty” and therefore require compilation.

The tool gathers the source modification date from main development objects, and the compilation date of the compiled objects. In some cases, one main development object (such as a message library) results in many compiled objects (messages).

The tool uses $ude(“exist”) to check the compilation timestamp of the compiled object and $ude(“compile”) to compile it.

The attached export file contains a full project export, so when you open project WIZ_COMPILE, you will see the whole thing.

You can download the export here: Feb 06
And here is a file with some test data for each object type: Feb 06

You will need the Uniface 10.3 DICT model to compile the tool. The new DICT model for Uniface 10.3 is delivered in umeta.xml in the uniface\misc folder of your development installation.

PLEASE NOTE: The sample tool does NOT take into account that a component may require compilation because a modeled entity or an IncludeScript has changed. See below.

Compiling components because a modeled entity has changed

Please note that the attached sample does NOT check if a component requires compilation because a modeled entity has changed. If you had this check in your Uniface 9 tooling, you also need to implement it in your new tooling.

A Uniface 9 based example for this issue can be found here:
You would need to simplify this for Uniface 10 because the modification date is only on the main development object.

Compiling objects because an IncludeScript has changed

Please note that the attached sample does NOT check if a development object requires compilation because an IncludeScript has changed.

To implement this is quite tricky, as you would have to find the #INCLUDES within the IncludeScript code, and handle the fact that they can be nested. To correctly parse all of that might not be much faster than just compiling everything…
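One way to approach it, sketched here in Python with made-up data structures (this is not the Uniface repository API), is to resolve the include graph recursively and take the newest modification date found anywhere in it:

```python
def effective_mtime(name, includes, mtimes, seen=None):
    """Newest modification time of a script or any (nested) IncludeScript
    it pulls in via #INCLUDE. 'includes' maps a name to the names it
    includes, 'mtimes' maps a name to its modification time; the 'seen'
    set guards against circular includes."""
    if seen is None:
        seen = set()
    if name in seen:
        return 0.0  # already visited: stop the recursion
    seen.add(name)
    newest = mtimes.get(name, 0.0)
    for inc in includes.get(name, []):
        newest = max(newest, effective_mtime(inc, includes, mtimes, seen))
    return newest
```

A development object would then count as dirty when its effective modification time is newer than its compiled object's timestamp, though as noted above, walking the graph may not be much faster than simply recompiling everything.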

AI First

Focus on the right stuff

For a long time, 'mobile first' was our software developers' paradigm. Every new application should not only take the mobile user into account but treat mobile as the primary platform. Nowadays, Artificial Intelligence (AI) is also on the agenda. But what does it mean for developers, and what happened to mobile?

AI first, mobile second?

Mobile is not forgotten. The 'mobile first' paradigm was necessary to make the step from desktop to mobile and to adopt the features of mobile devices. Now that mobile devices come first, it's common to design and build from a mobile perspective. Mobile is the new normal, just like desktop was in the 90s and the web (still on desktop, by the way) in the 2000s.

Artificial intelligence (AI) is not new. This is in contrast to mobile: twenty years ago we could only dream about the mobile revolution, and what we find normal these days, most visionaries didn't predict a decade ago. But AI dates from the mid-1950s, and from far before that. The idea of machines behaving like humans has been in our minds for generations.

Artificial General intelligence (AGI)

There are roughly two categories of AI: applied and general. The latter, Artificial General Intelligence, refers to general-purpose systems that more or less behave like the human brain. These systems can even be better (whatever that might mean) than Human General Intelligence. This is what we think about when we talk about AI. But it's elusive.
Have you seen the movie Her, about a man falling in love with his operating system? This is what we think AI is or should be: thinking like a human, but without the disadvantages of the human brain (the need for sleep, for instance). But it is also the image that scares us most, isn't it? Computers and robots taking over and making us, humans, superfluous.

Sometimes it/IT looks like magic. Do you remember the magician David Copperfield making the Statue of Liberty disappear? We all knew it was an illusion (although we didn't know how he did it). In the 18th century an automated chess player was invented: a machine that could play chess. It turned out there was a tiny chess player inside the machine.
In the 90s, neural networks were very popular: computers programmed to behave and learn like the human brain. Finally we had real AI! Very promising, but about a decade later we learned about big data. Why predict the future if you can calculate it?! Google could predict the flu based on search queries. That's what's going on these days. And that's what makes IT an interesting playground for Uniface.

Artificial Applied Intelligence (AAI)

Systems that replicate specific human behavior or intelligence. It varies from old-fashioned fuzzy logic (like the controller of your central heating system and the PLCs that control the traffic lights in your city) to Machine Learning (which you wish the traffic lights were controlled by). It all might look like a kind of intelligence, but most of the time it's something the developer more or less created. But the reaction triggered by a certain action depends on previous results: "Last time you, the user, were satisfied when I did this after you did that, so I am going to do exactly the same." There is nothing magical about that. That's combining data, computing power and a bit of common sense.

An example where I wish the developer had used AAI: on my phone I have a public transport app. When I type the name of the street I want to travel to, it shows me all the cities with this street. Of course I can start typing the name of the city, but I want the system to know where I want to go. Since I always use this system within my own city, I expect the system to learn how I use it and show the street in my own city at the top of the list.
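The learning such an app would need is modest. A minimal sketch (Python, with invented data; not a real transport API) that ranks the matching cities by how often the user has travelled to them:

```python
from collections import Counter

def rank_cities(matching_cities, travel_history):
    """Order the cities that contain the requested street so that the
    ones the user travels to most often come first; unseen cities get
    a frequency of zero and stay at the bottom in their original order."""
    freq = Counter(travel_history)
    return sorted(matching_cities, key=lambda city: -freq[city])
```

With a history of mostly local trips, the user's own city surfaces first, which is exactly the behaviour asked for above.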

Big data and sensors

Tesla knows how to use AAI. Their autopilot mostly depends on AAI. Every time something unexpected happens, the car communicates this to a centralized system. The system learns by comparing the specific situation, the performed actions and the results of those actions.
In fact, it's relatively easy. The only thing the system has to do is decide whether the current course is safe, constantly monitoring all real-time input. On the internet you can find videos of Teslas predicting a collision and taking proper action to prevent it. The autopilot stopped the car, as it should. From a human perspective this is not a big deal; it is what our brain does constantly while we are awake (and even in our sleep). And that is exactly what AAI is all about: replicating a specific part of human intelligence.

Modern times

Most 'old-fashioned' developers, and probably even the organizations they work for, still want to build software that is hardcoded from A to Z. That makes the development process manageable and testable. Software that supports the business processes of yesterday.
Nowadays users expect software to think with them: software that supports their wishes and demands of tomorrow. Within a few years they will expect their systems to think for them! In modern software development, AI must be kept in mind. Not every situation can be programmed or tested. It is not a developer thinking about every possible situation. Software is more than a long list of 'if then' statements; it's less.

All it takes, is a database with all possible situations and actions. Every new situation is added as soon as it occurs, updating this system on every possible occasion. The heart of the system consists of algorithms that determine which action is the best option given a certain situation.
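As an illustrative sketch (Python, with invented names), such a system reduces to a table of situations, candidate actions and a learned score per pair; choosing an action is then a lookup plus picking the best score, and feedback on every occasion updates the table:

```python
class ActionTable:
    """Situation -> {action: score}. The 'algorithm' is simply picking
    the best-scoring known action and reinforcing whatever worked."""

    def __init__(self):
        self.scores = {}

    def record(self, situation, action, reward):
        # Add the situation/action pair the first time it occurs,
        # then accumulate feedback on every later occasion.
        acts = self.scores.setdefault(situation, {})
        acts[action] = acts.get(action, 0.0) + reward

    def best_action(self, situation):
        acts = self.scores.get(situation)
        if not acts:
            return None  # unseen situation: nothing learned yet
        return max(acts, key=acts.get)
```

There are no 'if then' rules per situation here; the behaviour emerges entirely from the accumulated data, which is the point of the paragraph above.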

This is how a chess-playing computer beats a human grandmaster without cheating like the machine mentioned above: by playing (and winning and losing) over and over again and learning from it.

And this is how a robot teaches itself to walk: by walking, and falling and standing up over and over again.

Another example where I want more intelligence is my calendar. When I have an appointment, I want my calendar software to tell me when I should leave to be on time, based on my current location, the means of transportation, the traffic, my behavior (I walk fast, but always leave just too late), etc. And I want the software to warn me when a new appointment endangers my schedule for that day.


What makes a programming language suitable for AAI purposes?

• AAI is about data. Some of the data is static and stored in a database. With Uniface we can build data-intensive applications; that's what Uniface is designed for. It's technology independent, scalable and very stable. The Uniface programming language is optimized for reading data from, and storing it into, every common database system.

• AAI is about using sensors. Not all data is (relatively speaking) static; some comes in real time from sensors or user input. The Progressive Web Apps built with Uniface can use every hardware feature on mobile devices. And Uniface can even be installed on devices like the Raspberry Pi and use every sensor attached to the system.

• AAI is also about user input. Uniface supports a wide range of user devices, from old-fashioned desktops to mobile apps on a smartphone.

• AAI is about computing power. Applications built in Uniface can be deployed on every mainstream OS. The code is interpreted efficiently.

• AAI is about building clever algorithms. Developers don’t have to worry about OS and database specifics. So they can focus on writing clever software. Building algorithms is something every developer loves!

That sounds ideal for Uniface. And it is! I am very curious about your first AAI applications!

For samples, tools, add-ons, blogs and more, visit