Tag Archives: Cloud

Picking up on the latest and greatest on Microsoft’s Azure Platform

I recently attended Microsoft’s Tech Summit, held at Amsterdam’s RAI convention centre. For those of you who know me, my computing background is on the other side of the spectrum, with predominantly UNIX and Linux derivatives. This was my first Microsoft event ever, so it was with great anticipation and some uncertainty that I attended the keynote.

From the word go it was clear that Microsoft is heavily invested in cloud technologies, with customer stories from the Dutch Railways (Nederlandse Spoorwegen), who use Azure’s Big Data platform to predict when train components are about to fail, before they actually fail and cause unnecessary disruptions. Abel Wang then guided us through a demo using Azure that predicted crime hotspots in certain areas around Seattle. All of it very impressive.

The main reason for attending the conference, however, was to pick up on the latest and greatest on Microsoft’s Azure Platform. Microsoft Azure holds second place in the cloud provider arena but experienced the biggest growth of any player over the last year. Here at Uniface we already use Azure daily; the goal was to see if there were ways to better utilise Azure’s IaaS and PaaS offerings.

From all the Azure and application development sessions I learned a lot more about Azure’s PaaS offerings. In the ‘Protect your business with Azure’ session it was evident that Microsoft is fully committed to security and availability. By far the most interesting session was ‘Building Serverless Applications with Azure Functions’. It demonstrated how simple it is to run a basic event-driven application without investing any time in infrastructure or PaaS offerings.
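As a rough illustration of why this resonated: with a serverless platform like Azure Functions, the code you deploy shrinks to a small event handler, with triggers and scaling left to the platform. The sketch below is a hypothetical, platform-neutral stand-in for such a handler (all names invented), not the session's actual demo:

```python
import json

# Hypothetical sketch: the core of an event-driven function kept as plain
# Python so it runs anywhere. In Azure Functions this handler would be
# wired to a trigger (HTTP, queue, timer) by the platform; no server or
# infrastructure code is written by the developer.
def handle_event(event: dict) -> dict:
    """Process a single event and return a result payload."""
    name = event.get("name", "world")
    return {"status": "ok", "message": f"Hello, {name}!"}

if __name__ == "__main__":
    # Simulate one incoming event locally.
    print(json.dumps(handle_event({"name": "Azure"})))
```

The point of the pattern is that everything outside `handle_event` (routing, scaling, retries) is the platform's problem, not yours.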

All in all, the Tech Summit was a great success; I learnt a lot and will be applying the knowledge to the workloads we run in Azure.

Attending a cloud infrastructure training – A truly AWSome Day in Amsterdam

Last week I attended, along with a few other Uniface software engineers, the AWSome Day Amsterdam event organized by Amazon Web Services (AWS), the world’s largest provider of cloud infrastructure services (IaaS). The event was a one-day training in Amsterdam delivered by AWS technical instructors. More than 300 (maybe even 400) people attended; it was very crowded, but very well organized.

From Uniface, a few people from the cloud, mobile and security teams attended the event, each with their own project in mind.

The interactive training provided us with a lot of information about cloud deployment, security and usage for the web and mobile environments. The focus was on AWS as a provider of cloud infrastructure services. In a nutshell, technical instructors elaborated on the following:

AWS infrastructure with information about the three main services they offer:

  1. Amazon Simple Storage Service (S3), to store objects of up to 5 terabytes in multiple buckets. This service includes advanced lifecycle management tools for your files.
  2. Amazon Elastic Compute Cloud (EC2), which offers virtual servers as you need them. EC2 has advanced security and networking options and tools to manage storage. Also very interesting: you can write your own algorithm to scale up or down to handle changes in requirements or spikes in popularity, reducing costs and improving efficiency.
  3. Amazon Elastic Block Store (EBS), which provides persistent block-level storage volumes that you can attach to a single EC2 instance. Interestingly, EBS volumes persist independently of the running life of an EC2 instance. You can use EBS volumes as primary storage, especially for data that requires frequent updates and for throughput-intensive applications that perform continuous disk scans. EBS is flexible, in the sense that you can easily grow volumes.
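To make the "write your own scaling algorithm" point from the EC2 item above concrete, here is a hypothetical, self-contained sketch of such a rule. In a real deployment the decision would be driven by CloudWatch metrics and an Auto Scaling policy rather than a local function; the thresholds and names here are invented:

```python
# Hypothetical sketch of a custom scaling rule: pick a desired instance
# count from an observed average CPU load, clamped between configured
# bounds so the fleet never shrinks to zero or grows without limit.
def desired_instances(current: int, avg_cpu: float,
                      low: float = 30.0, high: float = 70.0,
                      min_n: int = 1, max_n: int = 10) -> int:
    if avg_cpu > high:
        current += 1          # overloaded: add capacity
    elif avg_cpu < low:
        current -= 1          # underused: remove capacity to save cost
    return max(min_n, min(max_n, current))
```

A rule like this is what an Auto Scaling policy expresses declaratively: scale out on high load, scale in on low load, within fixed bounds.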

 AWS Event

During the event we discussed the security risks, identity management and access functionalities extensively, as well as the use of different databases (SQL vs NoSQL) together with the cloud services. Other interesting topics were concepts such as auto scaling of EC2 instances, load balancing, and management tools such as CloudWatch and AWS Trusted Advisor, which seems very useful for tracking security and cost issues.

Uniface Attending AWS Event

In general, the event broadened my view of cloud deployment using AWS, but also of other cloud infrastructure services, as the same concepts can be applied to other cloud providers.

It was truly an AWSome Day in Amsterdam!

Uniface Training Modules offer more Flexibility

With the development of faster and even better Uniface software came a clear need for more flexible and efficient Uniface education and training. With the release of Uniface 9.7 the training materials were revisited, redesigned and partly redeveloped. The input from many Uniface consultants during and after the train-the-trainer session, conducted in October last year, proved invaluable.

The training materials have been developed in a more modular way to better meet the needs of our customers and to enable more flexible delivery. Three tracks have been defined. The courses can be delivered as classroom training or over the web, using the CloudShare® platform.

This blog briefly describes the available tracks and modules for these trainings.

After successfully completing the two-day Uniface Essentials course, there are three options. Each option takes three days to complete.

Uniface Training Modules

  • The Uniface Essentials training focusses on model-driven and component-based development and will equip students who are new to Uniface with the basic skills needed to develop software applications with Uniface. Students will be prepared for the next module. The Uniface Essentials module is a prerequisite for the Uniface Mobile, Uniface Web, and Uniface Client Server trainings.
  • With Uniface Mobile students will learn to develop responsive applications that can be deployed on mobile devices and tablet computers. Attention will be paid to some supportive frameworks for building responsive applications.
  • In the Uniface Web development class students are taught how to develop Uniface applications by building Dynamic Server Pages for the web. All aspects of stateless software development are covered in this course. Some attention will be given to HTML5 and CSS3.
  • With Uniface Windows Client students learn to build applications for the Windows platform.

For each module students are encouraged to complete a number of exercises to become more acquainted with the specific topics covered. There will be enough time for questions, discussion, and the exchange of ideas and information to optimize the learning process.

Besides these trainings, where students learn the basic skills, more advanced topics and techniques can be covered in custom-made trainings. These customized trainings are delivered on customer demand only, and can be geared toward specific customer situations.

For questions, comments or remarks about Uniface training please contact uniface.training@uniface.com, or download this fact sheet with more information.

GDG DevFest, Amsterdam Edition

On October 10th, 2015, Google organized the biggest Google tech-related event in the Netherlands, at Amsterdam Science Park, the heart of science in Amsterdam. Uniface, as one of the sponsors, was invited to join. Representing Uniface, the mobile development team attended the Google DevFest, an event carefully crafted for developers by the Dutch GDG communities.

Uniface Mobile Dev Team

GDG DevFests are large, community-run events that can offer speaker sessions across multiple product areas, all-day hack-a-thons, code labs, and more. Google Developer Groups (GDGs) are for developers who are interested in Google’s developer technology; everything from the Android, Chrome, and Drive platforms to Google Cloud.

A GDG can take many forms — from just a few people getting together to large gatherings with demos and tech talks, to events like code sprints and hackathons. However, at their core, GDGs are focused on developers and technical content, and the core audience is mainly developers.

Each GDG DevFest is inspired by and uniquely tailored to the needs of the developer community that hosts it. DevFest 2015 also had a series of speaker sessions and workshops on web, mobile and cloud solutions. Some glimpses of the sessions:

  • Material coordination – a new design support library by Google, which brings many material design components, including a navigation drawer view, floating labels, floating action buttons, snack bars, and a new framework to tie motion and scroll events together. The library supports Android version 2.1 and higher.
  • Firebase: a codeless backend for Android – Firebase is a cloud services and backend-as-a-service provider. Its primary product is a realtime database, which provides an API that allows developers to store and sync data across multiple clients.
  • Google Cloud Endpoints and AngularJS – Google Cloud Endpoints consists of tools, libraries and capabilities that allow you to generate APIs and client libraries from an App Engine application, referred to as an API backend, to simplify client access to data from other applications. Endpoints makes it easier to create a web backend for web clients and mobile clients such as Android or Apple’s iOS.
  • UI Router – a routing framework that allows us to organize the interface as a state machine, rather than a set of simple URL routes.
  • Polymer – a library that makes it easier than ever to build fast, beautiful, and interoperable web components.
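As an aside on the UI Router session above: the idea of organizing an interface as a state machine rather than a flat URL map is language-agnostic. Here is a minimal sketch in plain Python (UI Router itself is an AngularJS library; all state names and the `Router` class below are invented for illustration):

```python
# Hypothetical sketch of the state-machine idea behind UI Router: named
# interface states with explicitly allowed transitions, instead of a flat
# URL-to-view mapping. Illegal navigation is rejected up front.
TRANSITIONS = {
    "home":   {"list"},
    "list":   {"detail", "home"},
    "detail": {"list"},
}

class Router:
    def __init__(self, start: str = "home"):
        self.state = start

    def go(self, target: str) -> str:
        """Move to `target` if the current state allows it."""
        if target not in TRANSITIONS.get(self.state, set()):
            raise ValueError(f"cannot go from {self.state} to {target}")
        self.state = target
        return self.state
```

The benefit over plain URL routing is that the set of valid navigations is explicit and can be checked, nested, and reasoned about.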

Last but not least, and most importantly, as a sponsor we also had the opportunity to give a short pitch about Uniface to the developer groups. The pitch was nicely outlined and well presented by Thomas Stolwijk.

Uniface Sponsor Pitch

Modelling: Essential Not Optional (Part 2)

By Ian Murphy, Principal Analyst and Bola Rotibi, Research Director, Creative Intellect Consulting

Read Part 1 here.

Complexity is inherent in our IT DNA

One of the goals of IT for decades has been to reduce the complexity of the systems it writes and maintains. There are several reasons for this. Users want solutions faster, budgets are shrinking and complexity fuels failure.

Agile development, automation, Cloud computing and DevOps are all helping IT deliver applications faster and at a lower cost. This is positive news for the business. But what about the rising issue of complexity?

Unfortunately, complexity is inherent in IT systems that are used to run businesses. Stock control needs to be integrated with sales order processing which in turn is integrated into accounting systems. Call centre teams need access to these same systems to deal with customer queries. Online shops must be able to create new customers, display stock levels, take orders, and pass data to fulfilment systems. These are just some of the very basic systems that companies use.

We are now in a mobile world where applications are required to run in web browsers or be written for multiple operating systems and classes of devices. These devices are not owned exclusively by the business; instead, they are increasingly the property of individuals.

This means that any application deployed on these devices is not just running in the context of a controlled environment but has to coexist alongside other applications that IT has no knowledge of or control over. The end result is an incredibly complex set of security and performance issues that IT cannot know in advance yet has to write solutions to deal with.

A further complication is that security is a constant challenge. The rise of malware, the ability of hackers to penetrate systems seemingly at will, the risk to corporate data and the surge of compliance requirements are seemingly never ending.

Modelling has a new relevance

There is a new relevance for modelling in IT systems. Let’s take the example of an application designed to help an insurance sales team.

The requirement from the sales force is that they want an application that runs on their tablets and smartphones, that is capable of validating user details and can help deliver quotes, on the spot, that customers can sign up to.

From an IT perspective the operating system is unknown. The local storage and security capability of the devices are unknown. The application needs to integrate with customer systems which means they have to do data validation at point of entry. Information gathered needs to be risk assessed in order to create a meaningful policy and payment schedule. If there are potential problems, the application needs to be able to pass all the data to an underwriter in order to get a response.

This is just a quick list of potential issues and at every point there will be integration with other systems and the need to pass data around.

A computer model of this system might be very simple to begin with. Mobile device connecting to customer system, check for existing or new customer, data validation required, policy risk assessed and then payment schedule set.
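Purely as an illustration (the step names below just restate the flow described above, and the helper is invented), even this simple model can be captured as an ordered list of steps and walked through one at a time, which is where the questions start to surface:

```python
# Hypothetical sketch: the simple insurance-sales model as an ordered
# sequence of named steps. Walking the list is a stand-in for tracing the
# flow and asking questions (latency? connectivity? fallback?) at each step.
MODEL = [
    "mobile device connects to customer system",
    "check for existing or new customer",
    "validate data at point of entry",
    "assess policy risk",
    "set payment schedule",
]

def trace(model):
    """Yield each step with its position, as a minimal walkthrough."""
    for i, step in enumerate(model, start=1):
        yield f"step {i}: {step}"

for line in trace(MODEL):
    print(line)
```

Even at this toy level, each numbered step is a natural place to attach a question about timing, integration, or failure handling.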

This simple model enables key areas to be highlighted for further investigation. For example, does this have to be real-time? What performance speed is required? Can it be done over 3G or does it need a WiFi link? How long does it take to validate customer details? What happens if an underwriter is needed to make an assessment? How many users can the external gateways support at any point in time?

In short, the model encapsulates the five key points that models in general must deliver in order to be effective. The problem has been abstracted to a mobile device connecting to core systems. Understanding is achieved by all parties because the abstraction is clean and contains just enough detail to see where potential problems could occur. The model is accurate because it describes exactly what is needed and the key steps involved. The identification of the external gateway as a bottleneck, and the time required to carry out key tasks, means that predictions can be made. Finally, there has been little to no cost at this point in establishing the model.

This is an overly simple example of a system with limited integration points but it demonstrates how quickly a model can begin to highlight areas of concern and how they can be further addressed. There would be no reason why the data validation couldn’t be modelled in more detail to understand what was being gathered and how it would be validated. The same is true of the process that creates the policy and determines the payments.

Modelling: relevant and crucial for Cloud computing

One of the major impacts on the IT landscape has been the arrival of Cloud computing. Systems may exist in a private cloud, a public cloud or be split over the two in a hybrid cloud.

In all three cases there is a need to understand how an application will be architected to take advantage of the capabilities that Cloud computing offers. Six key questions surrounding any application deployment to the Cloud are:

  • Where will application components sit?
  • Where will data be stored?
  • What is required by data protection and compliance laws?
  • What level of performance and scalability does the Cloud provide?
  • What security and encryption will be used?
  • What cost savings do the different cloud models offer?

Modelling allows companies to begin to address all of these questions. At the very basic level it will show application components and highlight potential integration challenges. For data, it will enable compliance teams to determine whether the company has a legal problem. Security teams can begin to identify what is needed to meet corporate security needs.

Without a model, a lot is taken on trust and people fail to properly identify challenges. Many companies are beginning to realise that there is far more complexity in moving applications to Public and Hybrid Cloud than they would ever have realised. A model would enable them to not only see what was moving but then enable subject matter experts to ask questions about integration, security and suggest what further and detailed models are required.

Model or be damned

There is no excuse for not modelling IT systems and in particular software developments. The five stages are clear and easy to use.

The key is in keeping it simple, using models to explore potential challenges and not over complicating things. Many organisations will ultimately discover that they don’t need a new model for every application and system because the similarities at the model level are very high. For example, mobile applications share a lot of common elements. Where they differ is at the accuracy and prediction stages.

Those companies that use models will identify problems sooner, reduce cost, and understand complexity. They will also open up opportunities for greater reuse and flexibility. In an age where business agility is paramount, modelling enables a company to deliver what users want, faster and with less risk.