Open Source of satisfaction

Solutions architect and partner at FiveForty°, Jérôme Piquot shares his vision of Open Source with us and sheds light on the distributed applications he is working on. A little stocktaking...

How long have you been involved in Open Source projects and how did you come to this principle of collaboration?

JP: I started in the early 90s, in what we then called newsgroups. At that time, only a small number of people had access to the internet, and the information available was limited, to say the least. A need for sharing was emerging. We made friends through discussions with people who were trying to solve the same problems or who had similar goals. The first developer communities were born of this willingness to do things together.

Over the years, can you give some examples of the work you have done? 

JP: My first Open Source project was for Gedimat, on management and invoicing software for wholesalers. It revolved around orders, delivery notes and invoicing: a kind of very modest ERP embryo that handled nothing beyond sales. The Open Source part was a web application written in C++. The HTML screens were generated with XSLT, and the printouts, in PDF format, were produced with XSL-FO and Apache FOP.

Then came the POS, also in Open Source. This cash register management module did not exist at the time; nowadays it is integrated into your Microsoft solution. Did it use technologies that you still offer today?

JP: I've always been a bit of a technology geek. It is true that I was ahead of my time, and those technologies are still current today, in particular in AX, which used to rely on XSLT. Then I returned to Open Source for Lagardère. Today, I am moving into a new area: distributed applications.

In a nutshell?

JP: At the moment, applications can only run on a single processor, and that is far too slow. We must completely change our development paradigms so that an application, when it runs, can do so on all available processors at the same time. If a classic application has 100 additions to make, it calculates each addition one after the other on the same processor. A distributed application runs all 100 additions at the same time, on 100 processors simultaneously. That seems simple to implement for a calculation, but it becomes much more complicated when the data being processed is itself distributed. This imposes completely different programming models, such as the actor model.
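
To make the contrast concrete, here is a minimal sketch in Go (an illustration only, not code from any project mentioned in the interview): the same 100 additions computed one after the other, then fanned out as goroutines that the runtime spreads across all available processor cores.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

func main() {
	const n = 100
	results := make([]int, n)

	// Sequential version: one addition after the other on a single processor.
	for i := 0; i < n; i++ {
		results[i] = i + i
	}

	// Parallel version: each addition runs in its own goroutine, and the Go
	// scheduler spreads the goroutines over all available processor cores.
	var wg sync.WaitGroup
	for i := 0; i < n; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			results[i] = i + i
		}(i)
	}
	wg.Wait()

	fmt.Printf("computed %d additions on up to %d cores\n", n, runtime.NumCPU())
}
```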

You have also published an Open Source management tool for SQL backups. In the end, what do you take away from these experiences?

JP: A certain idea of sharing. When you develop, you make great use of the community's work. In return, it is natural to contribute something of your own to thank all the people who have helped you. You have to give back what saved you time so that others can go faster too. Open Source takes a lot of personal time, time you don't always have. It's a state of mind.

Sharing is a core value of FiveForty°. Do you see a link with the principle of Open Source?

JP: FiveForty° is built on values, moral ones even, that we find in the community of individuals who make up the company. We benefit from a community and we give back to it through our activity. We take and we give back: that is the logic of Open Source, which is much more than just code shared on the Internet. It is also know-how, the exchange of ideas, experiences and mutual assistance. All of these are at work at FiveForty°, that's true.

In both a professional and a personal capacity, what free software do you use?

JP: CMSs like Orchard Core, a website creation tool initiated by Microsoft; IdentityServer4 to manage authentication; but also Orleans and Dapr, frameworks for managing actors and building distributed applications. The other key part of Open Source is bug reporting. In that respect, .NET Core is a good tool for reliability. The framework has been 100% Open Source for three years. It reflects a major change at Microsoft, which is starting to move away from a purely proprietary logic by opening up the code of its software infrastructure, with the exception of Windows.
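
Orleans and Dapr both build on the actor model mentioned above: small, isolated objects that own their state and interact only through messages. As a rough, framework-free illustration (this is not Orleans or Dapr code), an actor can be approximated in Go by a goroutine that owns its state and a channel that serves as its mailbox:

```go
package main

import "fmt"

// deposit is a message sent to the account actor's mailbox.
type deposit struct {
	amount int
	reply  chan int // the actor answers with the new balance
}

// newAccountActor starts an actor: a goroutine that owns the balance and
// processes one message at a time from its mailbox, so no locks are needed.
func newAccountActor() chan<- deposit {
	mailbox := make(chan deposit)
	go func() {
		balance := 0
		for msg := range mailbox {
			balance += msg.amount
			msg.reply <- balance
		}
	}()
	return mailbox
}

func main() {
	account := newAccountActor()
	reply := make(chan int)
	for i := 0; i < 3; i++ {
		account <- deposit{amount: 100, reply: reply}
		fmt.Println("balance:", <-reply)
	}
}
```

Because the goroutine processes one message at a time, the state needs no locks; frameworks like Orleans and Dapr then add distribution, placement and persistence on top of this basic idea.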

What changes do you notice, and what future do you see for Open Source?

JP: The advantage of Open Source is also measured in this hybrid world of ours. Barely fifteen years ago, everyone was carrying out their own little project in their corner with a small community. Today, Open Source projects have changed in scale and are very often led by a software publisher. When an incident occurs, we can now go and fix the publisher's code directly, and that fix, once validated, is integrated into the solution. It's a real exchange between the users and the publisher. The publisher loses nothing; it also gives cohesion to the whole. It comes out ahead, with much more reliable products thanks to a huge base of users testing them. More generally, Open Source is an asset insofar as it allows very significant cost sharing. When twenty-odd companies use an Open Source release, the community maintains it and multiplies its capabilities. The result is a powerful, remarkably agile tool, where the short life cycle of a change can be handled through continuous integration. And the more of us there are, the faster we go.

Can you give examples of the impact of Open Source ...

JP: Google, with Chrome, made the product ultra-stable by releasing the Chromium source code as Open Source. That gave it a technological lead that Microsoft could not match with Edge; it was forced to build a new version of Edge based on the Chromium project. The publisher remains essential to the overall strategic vision. Firefox, for example, which is independent but struggles to maintain a clear roadmap for its product, may gradually disappear. Note the leverage effect of having a publisher behind an Open Source project: its firepower, combined with a community that multiplies the tests. It also brings contributions of ideas and improvements, as with Orleans, whose performance was optimized by a Russian contributor. Dapr, started by Microsoft and going further, is a project set to replace Microsoft Service Fabric. Many new Microsoft products start out as Open Source, and not necessarily with in-house technologies: Dapr is written in Go, a Google language, simply because it is better suited to certain aspects.

You point out that, in terms of processor speed, we are reaching the limits of physics at around 5 GHz. What consequences can be expected in the medium and long term?

JP: The consequence is that power can no longer come from the CPU alone, the computer's central processing unit. It used to be easy: these problems were solved by replacing an old machine with a newer one that was twice as fast. Today, you have to be able to add machines to one another. Yet not even 1% of the applications in use in the world today can handle that kind of horizontal elasticity. This means that, potentially, an overhaul of the ERP may have to be considered tomorrow. ERPs are large monolithic applications that handle parallelization of processing very poorly. To face tomorrow's data volumes and customers' real-time requirements, these monoliths will need to be rewritten as distributed applications, organized into micro-services or nano-services (actors).

Aren't some French companies being ossified by their IT systems?

JP: Some of them do end up suffering from it. To put it simply, it takes them a year to add a single button! Within an IT department, applications are often so tightly coupled to one another that touching any line of code in any of them is a perilous exercise, a game of Mikado whose knock-on effects are impossible to control. This is why all software architecture experts recommend decoupling applications: it makes it possible to replace a single brick without any impact on the others. What has changed in recent years is that we now apply this decoupling within the application itself. Each module of an application must itself be decoupled from the others, which makes it possible to evolve a module without fearing regressions in the rest of the application. The number of regressions is considerably reduced, maintenance is simpler, and changes are less complex to make and are delivered in shorter timeframes.

This modularity makes it possible to architect an application as small, scalable, autonomous units (micro-services) that are easy to deploy on internal, cloud or hybrid infrastructure.
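
As a purely illustrative sketch (the module and type names are invented for the example, not taken from any FiveForty° project), decoupling in Go usually means having one module depend on a small interface rather than on another module's concrete implementation, so either side can evolve or be replaced without touching the other:

```go
package main

import "fmt"

// InvoiceSender is all the billing module knows about delivery. Any
// implementation (SMTP, a message queue, a mock in tests) can be swapped
// in without changing the billing code.
type InvoiceSender interface {
	Send(customer string, amount float64) error
}

// Billing depends on the interface, never on a concrete sender.
type Billing struct {
	sender InvoiceSender
}

func (b Billing) Close(customer string, amount float64) error {
	// ... billing business logic would live here ...
	return b.sender.Send(customer, amount)
}

// consoleSender is one possible implementation, used here for the demo.
type consoleSender struct{}

func (consoleSender) Send(customer string, amount float64) error {
	fmt.Printf("invoice of %.2f sent to %s\n", amount, customer)
	return nil
}

func main() {
	b := Billing{sender: consoleSender{}}
	_ = b.Close("ACME", 1250.00)
}
```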

When a need for elasticity arises to manage peaks in demand linked to seasonal variations, this capacity issue can be addressed in minutes with tools like Kubernetes, which deploy new resources and absorb peak workloads.

But you also have to be elastic in the other direction: able to reduce resources when needs decrease. Today's technologies make it possible to plug cloud machines into the internal data center. The classic ratio, 20% cloud and 80% internal on-premises, is a reasonable hybridization for a company that wants to arm itself against the variations to come.
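
To illustrate that two-way elasticity, here is a hypothetical Kubernetes HorizontalPodAutoscaler manifest (the service name and thresholds are invented for the example); it adds replicas of a service when CPU load rises during a peak and removes them again when demand falls back:

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: orders-service          # hypothetical service name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: orders-service        # the deployment to scale
  minReplicas: 2                # shrink back to this floor when demand drops
  maxReplicas: 20               # grow up to this ceiling during seasonal peaks
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # target average CPU utilization
```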

Finally, how is your collaboration on Open Source projects a plus for FiveForty° customers?

JP: In a world that is marching rapidly towards openness, having a partner who combines innovation with the Open Source mindset, and who understands where the new technologies are heading, allows customers to follow these developments in the Dynamics world with greater peace of mind.

Interview by J. Lascaux, Founding partner of FiveForty°


