In a world where technology is part of everyday life and of every business, choosing the right tool and the right technology stack is critical to succeed and achieve your aims. Information technology should be your right hand, tailored to your needs, adapting to your organization, and fitting your workflows and communication structure as closely as possible (Conway's law).
Special attention must be paid to the data that flows into our systems from varied sources. Correctly collecting, transforming and extracting fresh information is the key. Unlike other industries, the IT industry has a big advantage: the input of a process is not destroyed by the process itself, but can be reused for further analysis and computation. This means we can potentially generate new meaningful information from the same dataset an infinite number of times.
In the real world, each company has evolved through different "informatic epochs", during which new applications have been deployed, new workflows created, new data types exchanged, and heterogeneous systems have come to coexist. Moreover, the amount of data to be processed has drastically increased, as has the number of services.
The keystone has been, and still is, INTEGRATION and SCALING!
The market's response to this need is the cloud, whether public, private or hybrid. Multiple offers from different vendors are available, each with its own benefits and drawbacks. I'm going to give an overview of the possibilities offered by Microsoft Azure after spending a full day at the Microsoft Tech Summit in Paris.
Closer to clients and to business
Microsoft has recently announced that 4 new datacenters will soon be available in France: 3 in the Paris region and 1 near Marseille. This choice is strategic. Being closer to clients means providing a higher level of service, for example by reducing the latency of fetching data. A director from Engie said that their latency for accessing data is near zero, since their datacenters are next to Microsoft's.
Having the data in a specific country is not only a matter of speed but also of governance. Some businesses are strictly regulated and require that client data be stored in a specific geographical zone, replicated, auditable and secure. This is the reason for having 4 datacenters in France: replication for data resiliency among the 3 datacenters in Paris, plus a fourth datacenter in Marseille as a backup in case of a major disaster in the Paris area. The service-level agreement (SLA) for data availability is four nines (99.99%).
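"Four nines" sounds abstract, so it is worth translating the SLA into concrete downtime with a quick back-of-the-envelope calculation:

```python
# What a 99.99% ("four nines") availability SLA permits in downtime per year.
availability = 0.9999
minutes_per_year = 365 * 24 * 60               # 525,600 minutes in a year
downtime_minutes = (1 - availability) * minutes_per_year
print(round(downtime_minutes, 1))              # about 52.6 minutes per year
```

In other words, the SLA allows for under an hour of unavailability per year, or roughly 4.4 minutes per month.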
Microsoft has invested around 3 billion dollars in building new infrastructure in Europe and has obtained multiple certifications for data storage compliance.
The 4 pillars
In a context where agility and rapid response to events are required, it is important to have the right tools to work with and an infrastructure that can support all the requests. With the cloud, the end user, such as a developer, doesn't have to care about the underlying infrastructure or the scalability of the system. They can stay focused on their task: creating software. The paradigm is similar to Java's "write once, run everywhere": a developer can write code in C# and deploy it on multiple platforms. A real-world example is UPS, which used Microsoft Azure to rethink its business and make it more efficient.
Once the software has been written, the next step is to have a simple way to create a test environment, test the software, aggregate the logs and monitor the services. Microsoft has done a great job here, building a user-friendly interface to create, instantiate and monitor services deployed in Microsoft Azure.
Another essential step is communication between services and databases. They can belong to heterogeneous systems, expose different APIs and be deployed on different types of clouds. Microsoft has come up with a really good solution for addressing this integration and getting all those parties talking together: an event-based orchestrator. In Azure, this application is called Logic Apps. It exposes a large set of connectors for simple integration with on-premises solutions and with third-party software, including open-source software. A simple and intuitive web interface lets the user subscribe to events and associate actions with them. I attended a 10-minute demo showing how to create a workflow that monitors tweets with a specific tag, infers the emotional status of each tweet using a built-in AI component, and posts the message to an internal chat. All this was done without writing a single line of code. Moreover, you can access the configuration file of your Logic Apps schema and reuse it if you want to automate the process. Very nice!
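To make the trigger → AI step → action shape of that demo concrete, here is a code-level sketch of the same workflow in plain Python. The sentiment scorer and chat poster below are hypothetical stand-ins for the Logic Apps connectors, not real Azure APIs:

```python
# Sketch of the demo workflow: new tweet (trigger) -> sentiment (AI step)
# -> post to internal chat (action). All three helpers are illustrative
# stand-ins for Logic Apps connectors, not actual Azure SDK calls.

NEGATIVE_WORDS = {"bad", "broken", "angry"}

def score_sentiment(text):
    """Stand-in for the built-in AI connector: 0.0 = negative, 1.0 = positive."""
    return 0.0 if any(w in text.lower().split() for w in NEGATIVE_WORDS) else 1.0

def post_to_chat(channel, message):
    """Stand-in for the chat connector: returns the message it would post."""
    return f"[{channel}] {message}"

def on_new_tweet(tweet, tag):
    """The workflow itself: only tweets carrying the tag reach the chat."""
    if tag not in tweet:
        return None
    mood = "positive" if score_sentiment(tweet) >= 0.5 else "negative"
    return post_to_chat("support", f"({mood}) {tweet}")

print(on_new_tweet("#myproduct login page is broken", "#myproduct"))
```

In Logic Apps the same chain is expressed graphically, and the generated JSON definition plays the role of this script.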
Another big piece of the cloud puzzle is scaling. Microsoft has introduced into the Azure ecosystem something called Functions: serverless components triggered by events. If you are familiar with the AWS world, you will recognize the same concept as Lambda functions. A Function is a portion of code executed when a specific event fires (e.g. an HTTP request, a database operation, a scheduled task, ...). Functions can be instantiated multiple times and scale according to the workload, and they can be written in a multitude of programming languages. From a cost-management point of view, the client pays only while the Function is running.
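The serverless model can be summarized in a few lines: you write only the handler, and the platform routes events to it and bills per execution. This is a conceptual sketch in plain Python, not the actual Azure Functions SDK:

```python
# Conceptual sketch of serverless execution: the developer writes only the
# handler; the platform (simulated by trigger()) runs it when an event fires.
# Names here are illustrative, not the Azure Functions programming model.

def handle_http_request(request):
    """Business logic: runs once per triggering HTTP request."""
    name = request.get("params", {}).get("name", "world")
    return {"status": 200, "body": f"Hello, {name}!"}

def trigger(event_type, payload, handlers):
    """The platform's job: route an incoming event to the right handler,
    spinning up an instance only for the duration of the call."""
    return handlers[event_type](payload)

handlers = {"http": handle_http_request}
response = trigger("http", {"params": {"name": "Azure"}}, handlers)
print(response["body"])  # Hello, Azure!
```

The "pay only while running" billing follows directly from this shape: no event, no handler invocation, no cost.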
It's rare to find a company without existing infrastructure, applications or databases. Adopting a cloud solution therefore requires a phase of integration with, or migration from, the current system. Microsoft strongly believes that hybrid is not a transient state but should be a stable configuration. To simplify partial or complete migration to the cloud, Microsoft offers a set of tools.
Azure Stack is an application that clients can install in their own datacenter, providing the same capabilities for allocating resources and instantiating applications as Azure itself.
Concerning data integration and migration, Microsoft exposes an interface for importing relational databases without too much work. Only SQL Server is supported for the moment; PostgreSQL and MySQL support is coming soon.
In terms of license portability, users who already have a contract for Windows Server or SQL Server can migrate their applications to the cloud with cost savings of around 50%.
Hybrid cloud has been successfully implemented for the department of Val-d'Oise in France. The main constraint keeping them in a hybrid configuration is that the data, citizens' records, must stay in an on-premises solution.
Artificial intelligence is meaningless without data. Data integration and transformation is the first step of AI. Azure helps collect data coming from different sources, including the world of IoT, where the quantity of data is huge and the frequency of interaction is high.
To handle this load, Azure offers a service called Event Grid, capable of handling 10 million events per second.
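Each event pushed to Event Grid follows a small published schema (id, type, subject, timestamp, payload). The sketch below builds one such event; the topic endpoint and access key you would POST it to are account-specific and omitted here:

```python
import json
import uuid
from datetime import datetime, timezone

def make_event(event_type, subject, data):
    """Build one event following the Event Grid event schema.
    Publishing it requires your own topic endpoint and key (not shown)."""
    return {
        "id": str(uuid.uuid4()),                 # unique per event
        "eventType": event_type,                 # e.g. "iot.telemetry.reading"
        "subject": subject,                      # resource path the event concerns
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "data": data,                            # application payload
        "dataVersion": "1.0",
    }

event = make_event("iot.telemetry.reading", "devices/sensor-42",
                   {"temperature_c": 21.5})
# Event Grid expects a JSON array of events in the request body.
print(json.dumps([event])[:60])
```

The event types assumed here ("iot.telemetry.reading") are illustrative; in practice subscribers filter on whatever types and subjects your publishers emit.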
Once the data has been collected, the analysis or the training can start.
A number of built-in APIs already exist, and you can use them to build a PoC or to explore the AI world. The live demo presented the OCR API and how to call it from Python, as well as a chatbot service for car-insurance subscription. The bot can adapt itself to the user's language, identify the user by voice or image, detect the car model and analyze the user's sentiment from the text.
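Calling the OCR API from Python boils down to an authenticated HTTP POST. The sketch below only prepares the request (it does not send it); the region in the endpoint and the subscription key are placeholders you must replace with your own:

```python
import json
import urllib.request

# Placeholders: substitute your own Cognitive Services region and key.
# The endpoint shape follows the Computer Vision OCR REST API.
ENDPOINT = "https://westeurope.api.cognitive.microsoft.com/vision/v2.0/ocr"
SUBSCRIPTION_KEY = "<your-subscription-key>"

def build_ocr_request(image_url):
    """Prepare (but do not send) an OCR request for a publicly reachable image URL."""
    body = json.dumps({"url": image_url}).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT + "?language=unk&detectOrientation=true",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        },
        method="POST",
    )

req = build_ocr_request("https://example.com/receipt.png")
print(req.get_method(), req.full_url)
```

Sending it with `urllib.request.urlopen(req)` returns a JSON body describing the recognized regions, lines and words, which you then parse like any other REST response.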
"to believe that someone is good and honest and will not harm you, or that something is safe and reliable" (Cambridge Dictionary)
Trust comes from human relationships and implies a strong level of agreement and honesty between both parties. This is even truer when it comes to business, and that's why Microsoft has chosen a different positioning in the market, proposing itself not only as a vendor but as a partner.
Trust in the cloud domain means a high SLA and a focus on security. Microsoft has spent around 1 billion dollars on security and on protecting public and hybrid clouds from external threats. Several applications are in place to protect the environment and respond promptly to an attack. Windows Defender not only stops intrusions but also suggests how to prevent them; when an attack does occur, it is analyzed and the weak link is identified.
Data replication and recovery also give clients a way to protect themselves from ransomware that encrypts their files.
Trust is also being transparent about cost!
Microsoft Tech Summit gave me the opportunity to explore the Azure world, touching a little of each part of a big and complex system like Microsoft's cloud. From my point of view as a developer, a lot has been done to simplify integration and speed up development.
Keep in mind that a free plan is available for each service, so you can start playing and having fun with it!