Low-Code AI Tools: How to Implement AI-Based Apps Easily
This is part of Solutions Review’s Premium Content Series, a collection of contributed columns written by industry experts in maturing software categories. In this submission, KNIME Head of Data Science Evangelism Rosaria Silipo in collaboration with Aline Bessa and Emilio Silvestri offers advice on how to implement AI-based applications easily with the use of low-code AI tools.
It has become easier and easier to build artificial intelligence (AI) solutions. You do not need to be a full-fledged data scientist to train a machine learning (ML) model. You do not need to be a full-fledged programmer to include AI in your business applications. The right low-code tools can take you a long way in developing and productionizing AI-based applications.
The AI-Based Application
Recently, we developed a relatively complex AI application that includes:
- Data collection in the form of user feedback to a simple online game.
- Sentiment analysis of the user feedback with various ML and natural language processing (NLP) models.
- Productionization and orchestration of all modules together.
The AI models were trained and the single modules were developed using a low-code platform for data science. The final application was then made available to the real world through a dedicated deployment and productionization platform.
The Online Game and User Feedback Collection
The online game was a “Guess the Flag” game. The user is presented with a randomly selected flag, and they have three chances to guess the country to which it belongs. Each game session includes 10 flags. At the end of each session, the statistics of the user’s success and failure are presented via a dashboard, and the user gets the chance to leave a review.
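The game mechanics described above (three guesses per flag, ten flags per session, final stats) can be sketched in plain Python. This is purely illustrative: the actual application was built as a KNIME workflow with no code, and the flag data and function names below are invented for the sketch.

```python
import random

# Illustrative subset; the real game drew from a full country/flag dataset.
FLAGS = {
    "Italy": "it.png", "France": "fr.png", "Japan": "jp.png",
    "Brazil": "br.png", "Kenya": "ke.png", "Canada": "ca.png",
    "India": "in.png", "Norway": "no.png", "Chile": "cl.png",
    "Egypt": "eg.png",
}

def new_session(n_flags=10):
    """Randomly select the flags for one game session."""
    return random.sample(list(FLAGS), n_flags)

def play_round(country, guesses, max_tries=3):
    """A round is won if the correct country appears within the first three guesses."""
    return country in guesses[:max_tries]

def play_session(answer_key, all_guesses, n_flags=10):
    """Score a session and return the stats shown on the final dashboard."""
    results = [play_round(c, g) for c, g in zip(answer_key[:n_flags], all_guesses)]
    return {"correct": sum(results), "wrong": len(results) - sum(results)}
```

For example, a two-flag session where the user gets Italy on the second try but never guesses France would yield one correct and one wrong answer on the dashboard.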
The whole game was implemented as a web application without writing a single line of code. The low-code tool for data science, KNIME Analytics Platform, relies on the concept of "components" to implement interactive web pages. Although originally intended for dashboards, KNIME components are flexible enough that we could implement the whole web interface of the game with them:
- The flag presentation
- The answer options
- The final stats dashboard
- The feedback form
The Sentiment Analysis Application
A sentiment analysis application was applied to all reviews collected in the past 24 hours to evaluate the user experience of the game. There are many possible ways of implementing a sentiment analysis application, depending on available time, costs, expectations, and data.
- A solution based on classic NLP rules and statistics is easy to implement but requires some specific linguistic knowledge.
- In recent years, solutions based on machine learning algorithms have been preferred. These, however, require a representative data set for training.
- Still within machine learning-based sentiment analysis, deep learning models have shown better performance, though they require even more example data for training.
- Recently, pre-trained transformer models have been made available by companies with access to large amounts of text data.
Below is the KNIME workflow that implements a classic NLP-based solution for sentiment analysis. We chose this approach for its simplicity; any of the other approaches listed above could have been implemented just as easily.
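To convey the idea behind the classic NLP approach, here is a minimal lexicon-based scorer in Python. It is a stand-in for the KNIME nodes, not the actual workflow; the lexicon is a tiny illustrative word list, where real rule-based systems use curated opinion lexicons with thousands of entries.

```python
import re

# Tiny illustrative lexicon; real rule-based systems use curated
# opinion lexicons with thousands of entries.
POSITIVE = {"fun", "great", "love", "nice", "addictive"}
NEGATIVE = {"boring", "hard", "bug", "slow", "hate"}
NEGATIONS = {"not", "no", "never"}

def sentiment(review: str) -> str:
    """Label a review as positive/negative/neutral by counting lexicon hits.

    A negation word flips the polarity of the word that follows it,
    a common rule in classic NLP sentiment pipelines.
    """
    tokens = re.findall(r"[a-z']+", review.lower())
    score = 0
    negate = False
    for tok in tokens:
        if tok in NEGATIONS:
            negate = True
            continue
        hit = (tok in POSITIVE) - (tok in NEGATIVE)
        score += -hit if negate else hit
        negate = False
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"
```

A review such as "I love this game, great fun!" scores positive, while "Not fun, quite boring" scores negative thanks to the negation rule flipping "fun".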
Productionization and Orchestration
Before moving both applications, the flag game and the sentiment analysis, into production, we modularized them as much as possible: one service implemented the flag presentation and answer evaluation, one the feedback collection, one the sentiment analysis, and one the report generation. Each service consisted of a logical block and was productionized as a self-contained REST service.
The orchestration of all services was handled by KNIME Server, which can schedule module execution, call modules in cascade after other modules finish, and let one module consume the services of another via the Call Workflow Service node.
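The cascade pattern can be pictured in Python: each function below stands in for one of the self-contained REST services described above, and the orchestrator chains them so that each service's output feeds the next one's input. The service bodies and names are invented for illustration; they are not KNIME's actual API.

```python
# Each "service" stands in for a self-contained REST endpoint
# deployed on KNIME Server; the implementations are illustrative.

def feedback_collection():
    """Return the reviews collected in the past 24 hours (stub data)."""
    return ["Great fun!", "A bit boring at times."]

def sentiment_analysis(reviews):
    """Label each review (a crude stand-in for the NLP service)."""
    return ["positive" if "fun" in r.lower() else "negative" for r in reviews]

def report_generation(labels):
    """Aggregate labels into the daily report."""
    return {lab: labels.count(lab) for lab in set(labels)}

def orchestrate(services):
    """Call services in cascade, mirroring how KNIME Server chains
    modules: each result becomes the input of the next call."""
    result = None
    for service in services:
        result = service(result) if result is not None else service()
    return result

PIPELINE = [feedback_collection, sentiment_analysis, report_generation]
```

Running `orchestrate(PIPELINE)` produces the aggregated sentiment report; scheduling this cascade once a day reproduces the daily evaluation described above.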
The Steps to Success
It seems easy, right? And it was. However, there is more to this project than meets the eye. Many steps were required for a successful implementation:
- Management acceptance
- Project definition
- Skill set scouting
- Web development
- AI model training and deployment
- Service productionization
Let’s start from the bottom of the list. The open-source low-code tool that we used, KNIME Analytics Platform, took care of web development, AI model training, and model deployment. Its commercial companion, KNIME Server, took care of the productionization and orchestration. Both turned out to be the easiest parts of the whole project, mainly thanks to the low-code approach.
Thus, the key to success for this project was to reach the development phase with very clear ideas of what we wanted to develop, the resources we could dedicate, and how much support we needed from management. Nothing really compares to clear ideas. Indeed, this was the most complex part of the project: understanding what to develop and how all pieces could work together. It took a few meetings and more than a few conversations to reach a clear, final vision.
The goal of the project had always been to evaluate feedback left by users. We started with book reviews, but then users would first have needed to read at least one or two books before leaving feedback, making the feedback collection process too slow. We thought of songs, but the copyright issue arose. We thought of restaurants, but there we would not have owned the data. Then one of us had the idea of building a game. Gamification is a powerful tool to get users to interact. To avoid copyright issues, we made up our own game with flags, to which we appended a feedback collection form. The web development here would need some basic UI skills to make the game as easy as possible to use.
The second part was the evaluation of the feedback texts. For that, even a junior data scientist would have been able to implement a sufficiently accurate solution based on one of the four sentiment analysis approaches listed above.
Finally, the orchestration and productionization process was to be entrusted to a data engineer, someone with an eye for detecting logically isolated tasks and turning them into services.