Hello!

A history of formation and development is not unique to people: as we perfect ourselves, we also perfect the things we make.

Our bank is no exception: over time it has grown a web of numerous IT solutions, at the center of which sits the Equation automated banking system (ABS). My name is Kirill Dibrova. I work on the functional and architectural development of Equation, a huge and critically important software package for the bank, and one that looks nothing like the systems a typical modern developer deals with.

In this post I want to tell you about the forced "industrial revolution" that we had to carry out on the DevOps side of the ABS. So...

Beginning of the journey


Once upon a time, back in the half-forgotten year 2002, in a "galaxy far, far away", the developers of our bank began customizing the freshly implemented and, at the time, quite popular ABS Equation. It turned out to be a huge business monolith running on an IBM mainframe known back then as the AS/400. And for this work they used the era's "modern" IDE: the terminal editor SEU (Source Entry Utility).

[Image: SEU, the "modern" source editor for IBM i]

You have to admit, today it looks like the special effects from the original "Star Wars": harsh and merciless.

Somewhere in the mid-2000s IBM finally took pity on its developers and released Rational Application Developer for iSeries (later Rational Developer for i, or RDi for short). Life became more fun: a modern windowed interface, reasonably convenient tools for interacting with the server, and syntax support for RPGLE, the main development language. Even so, the code was still stored and edited within the specialized mainframe file system, and without a full change history and the other essential SCM features, collaborating on code was very difficult.

Still, that was tolerable for a modest team of eight people.
Under these conditions the development division entered the 2010s.

Know Yourself


The bank was growing, and the ABS development team was growing and expanding with it. As we approached two dozen developers, obvious problems began to surface: finding the current version of the code became quite a challenge, and simultaneous editing of the same file turned into an unsolvable problem. Build and deployment were a separate story: the application was installed with a script, and developers often copied the pieces they needed from one "batch file" to another, forgetting to adjust them for their particular deployment.

Such incidental overhead might have gone unnoticed. But then 2013 arrived, and there were already 50 developers, each spending about an hour a week on the usual routine: assembling code for delivery, running debugging tests, writing documentation and technical specifications. It turned out that over a month these "little things" added up to hundreds of man-hours. And so the DevOps spirit finally arrived at our bank.

Training


What did we want to do when we were handed the task of "automating everything"?
Well, for starters, we wanted a single and, importantly, convenient source code repository, ideally one most developers already knew, so as not to waste time on learning. And hooray: right at that time the bank was rolling out the Atlassian stack with Stash (a web front end for Git, now called Bitbucket) as a gradual replacement for SVN. On the wave of this universal migration, we moved there too.

Great! One item on the wish list had come true.

But there was one caveat: the source texts still had to be copied to the server for compilation and deployment. With RDi this could be done without much trouble, but keeping a project in both RDi and Git at the same time is rather awkward: RDi works with the "native" server file structure and imposes its own limitations on the project layout.
We tried iProjects, which inherit this specificity. Yes, they let you keep sources on local machines and have built-in Git support, but they are limited to a flat, linear project structure and burdened with metadata for compatibility with the native IBM i file system.

In short, the approach is full of all sorts of restrictions and, on the whole, about as inconvenient as working in SEU.

[Image: this is what RDi, IBM's IDE, looks like]

We had to keep looking.

The next wish was a unified deployment configuration, ideally with minimal platform-specific quirks, or none at all. We had to choose a comprehensive Continuous Integration tool.

There were no ready-to-use tools on the market: everything would require adaptation and significant spending on the initial setup and the ongoing support of CI.

Our requirements were as follows:

  • Git must be supported as the source repository;
  • there should be a tool that automates the process as a whole, such as Jenkins or Atlassian Bamboo, or support for using those systems;
  • the project should allow a flexible file structure, à la Apache Maven;
  • delivery scenarios should be flexible and make maximum use of the compilation and deployment tools we already had;
  • the toolkit should allow writing test scripts at different levels.

In general, all the commercial products support the common tools; the difficulties begin when it comes to integration under the specific conditions of your organization. Products from the key IBM i automation vendors, ARCAD and UrbanCode, unfortunately disappointed us, first and foremost by their lack of flexible integration with our server. Adapting external solutions to the compilation tools we already used was either too expensive or simply impossible.

Having weighed the difficulties, the prospects and the required investment, we decided to go our own way and build our own amusement park, complete with CI and autotests.

We started with something seemingly simple: choosing the notation in which we would describe the build. We originally tried JSON: it is simple, straightforward and very common.

[Image: the JSON declaration for the project]

But with JSON we would have had to write our own config parser and implement all the build machinery from scratch. That did not look very appealing.

Brainstorming the various options, we concluded that it was better to use a ready-made tool that provides the basic goodies out of the box yet can still be adapted to our requirements. We settled on a latest-generation build tool: Gradle. It inherits the capabilities of both Maven and Ant, is easy to configure, is extensible with plug-ins, offers flexible dependency-based builds, and is quite fast because it supports caching and incremental builds.

As a result, after four months we released the first version of our own Gradle plug-in for building and deploying applications on the IBM i mainframe.
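
To give an idea of how such a plug-in is structured, here is a minimal sketch in Kotlin; the class name, task names and log messages are purely illustrative and only mirror the kinds of tasks described in this post, not our actual implementation.

```kotlin
import org.gradle.api.Plugin
import org.gradle.api.Project

// Illustrative skeleton only: task names and log output are hypothetical and simply mirror
// the typical tasks described in this post ("assemble and deploy to dev", "publish the delivery").
class IbmIBuildPlugin : Plugin<Project> {
    override fun apply(project: Project) {
        project.tasks.register("deployToDev") { task ->
            task.group = "ibmi"
            task.description = "Copy sources to the IBM i server and compile them into the dev library"
            task.doLast {
                // the real plug-in pushes sources via jt400 and runs CL compile commands here
                println("Deploying ${project.name} to the dev environment...")
            }
        }
        project.tasks.register("publishDelivery") { task ->
            task.group = "ibmi"
            task.description = "Assemble a delivery package and publish it"
            task.doLast {
                println("Publishing the delivery for ${project.name}...")
            }
        }
    }
}
```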

The build tool did its job with a bang! It hid the whole "kitchen" of delivering code to the server and compiling it, exposing typical tasks to the consumer: "assemble and deploy to dev", "form and publish the delivery", "deploy the delivery to an environment". Credit is due to IBM for publishing the jt400.jar package, which lets you work with all IBM i services through a Java interface.
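
As a hedged illustration of what jt400 makes possible, here is a small Kotlin sketch that runs a CL compile command on the server; the host name, credentials and library names are placeholders, not our real configuration.

```kotlin
import com.ibm.as400.access.AS400
import com.ibm.as400.access.CommandCall

fun main() {
    // Placeholder connection details; in a real build they come from configuration, not code.
    val system = AS400("ibmi.example.com", "BUILDUSER", "password")
    val command = CommandCall(system)

    // Run a CL command on the server, e.g. compile an RPGLE source member into a dev library.
    val ok = command.run("CRTBNDRPG PGM(DEVLIB/HELLO) SRCFILE(DEVLIB/QRPGLESRC) SRCMBR(HELLO)")

    // Job messages (completion notices, compile errors, etc.) come back as AS400Message objects.
    command.messageList.forEach { println(it.text) }

    system.disconnectAllServices()
    if (!ok) error("Compilation failed")
}
```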

From manual deployment scripts written in the command language of the operating system (CL):

[Image: CL, the main language for server deployment scripts]

we moved to an entirely new level of build description.

The scripts became "beautiful", but writing them still took time. Only after several months of "torment" did we switch to a declarative description of the application and a lean, "economical" build.gradle:

[Image: examples of the build scripts developers write today]
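
To hint at what such a lean build script can look like, here is an illustrative sketch in the Gradle Kotlin DSL; our real scripts rely on the in-house plug-in and are only shown in the screenshots above, so the library and directory names here are made up.

```kotlin
// build.gradle.kts — illustrative sketch only; our real scripts use the in-house plug-in,
// whose DSL is not reproduced here. Library and directory names are placeholders.
plugins {
    id("base")  // standard lifecycle tasks such as `assemble` and `clean`
}

// A declarative description of the application: what it consists of, not how to build it.
val targetLibrary = "DEVLIB"
val sourceDirs = listOf("qrpglesrc", "qclsrc", "qddssrc")

tasks.register("deployToDev") {
    group = "ibmi"
    description = "Copy ${sourceDirs.joinToString()} to the server and compile into $targetLibrary"
    doLast {
        // with the in-house plug-in this step talks to the server via jt400
        println("Deploying ${sourceDirs.joinToString()} into $targetLibrary")
    }
}
```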

And after a while we added support for modularity:

[Image: splitting a large project into modules makes it easier to organize]
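
In Gradle terms a modular layout might look roughly like this settings file; the module names are hypothetical and serve only to illustrate the idea.

```kotlin
// settings.gradle.kts — module names here are hypothetical and only illustrate the idea
rootProject.name = "abs-equation"

include(
    "core",      // shared copybooks and service programs
    "payments",  // payment-processing programs
    "reports"    // reporting jobs
)
```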

Of course, the move was not without difficulties. We had to actively develop the new tool to match the developers' needs, move those same developers onto the new tooling, and rework the many existing utilities to run on the new "rails" (after all, developers used to launch them by hand, and now the machine does it). The hardest part was migrating developers to new standards and new tools; we had to fight the force of habit and years of accumulated conservatism.

Development


It is worth noting that, as in many other organizations, we could not form a permanently dedicated DevOps team. As a result, support and refinement of the CI tools is carried out by the entire community of IBM i developers. At the very least, we try to keep the contributions distributed and balanced: if someone needs an improvement, they implement it themselves.

Once we had mastered and stabilized the basic CI functionality, we turned to automated quality management. We had builds and delivery, but no tests. It turned out there was nothing coherent and widely used for the platform; there is ARCAD again, but we had already dropped it at the previous stage. We fully understood that adding autotesting is a crucial evolutionary step without which there can be no full-fledged Continuous Integration, and that proper speed and quality cannot be achieved without it. We also realized right away that we would have to change long-standing habits around writing tests, and that is a separate story we would have to go through.

At Alfa, autotesting developed along two parallel tracks: user-interface autotests through the GUI and GUI-less testing (we are backend developers, after all). At the same time, product teams were building their own tools, primarily for their own needs.
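
As a sketch of what a GUI-less test against a server program can look like, here is a hedged example that calls a hypothetical RPGLE program through jt400's ProgramCall API from a JUnit test; the program, library and connection details are placeholders, not our actual test tooling.

```kotlin
import com.ibm.as400.access.AS400
import com.ibm.as400.access.AS400Text
import com.ibm.as400.access.ProgramCall
import com.ibm.as400.access.ProgramParameter
import org.junit.jupiter.api.Assertions.assertTrue
import org.junit.jupiter.api.Test

class InterestCalcTest {

    @Test
    fun `program returns a non-blank result`() {
        // Placeholder connection details and a hypothetical program TESTLIB/INTCALC.
        val system = AS400("ibmi.example.com", "TESTUSER", "password")
        val text10 = AS400Text(10, system)

        val parms = arrayOf(
            ProgramParameter(text10.toBytes("ACC0000001")), // input: account id, 10 characters
            ProgramParameter(10)                            // output: calculated value, 10 bytes
        )

        val call = ProgramCall(system, "/QSYS.LIB/TESTLIB.LIB/INTCALC.PGM", parms)
        val ok = call.run()
        call.messageList.forEach { println(it.text) }       // surface any server-side messages
        assertTrue(ok)

        val result = text10.toObject(parms[1].outputData) as String
        assertTrue(result.isNotBlank())

        system.disconnectAllServices()
    }
}
```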

It is worth mentioning that, with so many developers spread across regions, the exchange of experience and information is far from instantaneous. So when a team built something for itself, few others knew about it; such implementations were poorly documented and rarely reused. It often took remarkable will and perseverance to get information about someone else's tool and learn to use it correctly.

Our traditional IBM i developer meetups have helped a great deal in spreading new ideas through the community. The first one was held in December 2017. And it is not at all that boring "party meeting" format where one person lectures about the "five-year plan" while everyone else sleeps or claps.

At these events teams can show other developers what they have built, get feedback and establish cross-team collaboration. The meetup has become a kind of supermarket: everyone brings something interesting that others can try, and those who have already tried something can discuss its further development and perhaps get the help they need. Besides the Alfa-Bank community, developers from other companies also take part. The annual IBMi Dev MeetUp has thus become an important information platform for us, an engine of progress in the community and a place for exchanging knowledge and technology.

At the moment, the community's own development of production tools is the main engine of progress in development automation on the IBM i stack. Anyone can put forward a proposal, anyone can take part in developing the CI/CD, and doing so levels up their skills. We often talk about motivation, saying that business forces us to grind through boilerplate tasks. Here we are talking about something else: creative, research-like work. Within a certain framework, developers decide for themselves what to do and what not to do; nobody has to force them to work more or less, everything is done willingly. The result is not a product to proudly put on a shelf, not at all; it is a product that all our colleagues across the organization actually use. And now we are moving towards an approach where other companies can use (and buy) the product as well. We are going through Rospotrebnadzor certification; here, for example, is our publication about software that has passed such certification.

Future Plans


The CI/CD toolkit is being actively developed and optimized. For example, work continues on a system of dependencies between objects within a single project and across projects. IBM does not stand still either and keeps adding new, convenient features to its development tools.

Perhaps some readers will not only find our experience interesting, but will also offer practical ideas for the path ahead. We will be glad to have your support.

Thank you.

Source