Access to ML model using the Pypeline library

Use the Pypeline library for local access to a trained machine learning model based on Python.

If a deployed, trained ML model can be accessed from Python, then Pypeline – a custom AnyLogic library – can be used to query it. Pypeline lets you execute Python scripts with arguments or interactively run Python code from within your AnyLogic models, using a local installation of Python.

Download Library

Any model developed in AnyLogic is a Java application that can be customized natively using the Java programming language. However, it is possible to introduce Python as an auxiliary programming language within your models, making it possible to use Python-based libraries and algorithms.

The Pypeline library can be used for cases such as:

  • Utilizing code that was originally written in Python without having to port it to Java.
  • Writing complex algorithms in Python that you can call in Java, optionally passing objects/data between the languages.
  • Working with any Python-exclusive library.
  • Using simulation as a testbed for testing trained artificial intelligence policies.

Pypeline is compatible with any valid AnyLogic license (Personal Learning Edition, University, or Professional) and with any version of Python 3 installed on your computer (except installations from the Windows Store).

To simplify data transfer between your Python and AnyLogic (Java) environments, Pypeline also includes features to convert data objects to and from the JSON format – an open standard data interchange format that is easy for both humans and computers to parse.
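As a minimal illustration of the kind of interchange JSON enables (the field names below are purely invented, and the Pypeline-side helper calls are not shown), a Python-side script might parse a JSON payload produced by the model and send back a JSON reply:

```python
import json

# JSON text such as an AnyLogic model might produce when serializing an agent
# (field names here are purely illustrative, not Pypeline's actual output).
payload = '{"name": "patient_42", "age": 63, "attributes": [0.2, 0.8]}'

# Parse the JSON into a Python dict, compute something with it,
# and serialize a reply that the model could deserialize on its side.
record = json.loads(payload)
reply = json.dumps({"name": record["name"], "risk": sum(record["attributes"])})
print(reply)
```

Because both sides speak plain JSON text, neither language needs to know the other's object model.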

The AnyLogic Company developed Pypeline as a free, open-source, third-party connectivity tool. It is optional and comes with no obligation of official support or compatibility guarantee of any kind. Users of the library may ask questions or leave comments in the Issues or Discussions tabs of the GitHub page.

Webinar: Pypeline – A Python Connector Library for AnyLogic

This webinar offers a comprehensive introduction to Pypeline. The presentation begins with an overview of what Pypeline is and its main use cases, followed by an in-depth technical demonstration of how to integrate Pypeline into simulation models. For the latest updates, including new settings and features, refer to the project's GitHub repository.

Example models


The examples below demonstrate various Pypeline library use cases at different levels of complexity. They are available from the GitHub repository and from the repository of models shipped with any edition of AnyLogic (AnyLogic welcome screen → Example models → How-to models/Examples).


  • 01

    Basic Functionality
    (Interactive Introduction to Pypeline)

    Pypeline Library Basic Functionality model

    This is an introductory demo in which you can explore the basic functionality that Pypeline provides through its three core functions: “run”, “runResults”, and “runFile”.

    Unlike the other example models, it is set up to let you interactively configure the PyCommunicator object and send information to, or receive it from, the underlying Python environment. The demo walks through the major features to give a hands-on introduction to using Pypeline.

    Explore

  • 02

    Simple Hospital (AI Testbed)

    Simple Hospital model

    This model depicts a simplified hospital where patients arrive, stay for some time, and then leave. Two neural networks are utilized: one predicts the arrival rate of patients based on recent arrival rates, and a second predicts patient length of stay based on 24 health-related attributes of the individual.

    Explore

  • 03

    Supply Chain Optimizer

    Supply Chain Optimizer model

    This is a model of a three-echelon supply chain for a single product. Each week, customers request varying amounts of the product from factories, which can store it in warehouses.

    The model includes costs associated with transporting the product between the three levels of the supply chain. The supply chain is imported from an Excel file to the database on startup. At the start of each week, the current model state is sent to a Python optimizer to determine how much product to send and to whom.

    Explore

  • 04

    Traveling Salesman

    Traveling Salesman model

    This model showcases an example of the Traveling Salesman Problem (TSP). In the TSP, a set of cities is given, with one designated as the home location.

    Given an arbitrary number of cities to visit, the goal is to find the shortest route that starts and ends at the home location. To optimize the visiting order, the Python library OR-Tools (by Google) is used.
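
    For intuition, the problem the model hands off to Python can be sketched as a brute-force search in plain Python. This is only an illustration: the model itself uses OR-Tools, which scales far better, and the city coordinates below are invented.

```python
from itertools import permutations
from math import dist

# Hypothetical city coordinates; index 0 is the home location.
cities = [(0, 0), (3, 4), (6, 0), (3, 1)]

def route_length(order):
    """Total length of the closed tour: home -> order... -> home."""
    path = [0, *order, 0]
    return sum(dist(cities[a], cities[b]) for a, b in zip(path, path[1:]))

# Try every visiting order of the non-home cities and keep the shortest tour.
best = min(permutations(range(1, len(cities))), key=route_length)
print(best, route_length(best))
```

    Brute force is factorial in the number of cities, which is exactly why the example model delegates the real search to a dedicated solver.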

    Explore

  • 05

    Lorenz Weather Model
    (with Pypeline)

    Lorenz Weather Model (with Pypeline) model

    This model is a slightly modified version of the original AnyLogic example model. The original is a System Dynamics model demonstrating a chaotic yet deterministic system. When the model starts, Pypeline spawns a new 3D window (via the Python library Matplotlib) and begins streaming data to it.

    You can interact with it by dragging the 3D graph to view the model from different angles or by changing the sliders in the simulation and viewing the results. This model also demonstrates Pypeline’s ability to function in parallel simulations without any extra code.

    Explore

  • 06

    Interconnected Call Centers
    (Web App with Pypeline)

    Interconnected Call Centers (Web App with Pypeline) model

    This model is a slightly modified version of the original AnyLogic example model “Interconnected Call Centers”. It shows how to use dashboard libraries (in this case, Dash) to build a live web app for the running model.

    Once the model starts, the web app launches in a background Python environment; open a new browser tab at "localhost:8050" to view it. The model periodically transmits its current state to Python, which then updates the web app. You can use this strategy to create alternative interfaces for analyzing your model as it runs.

    Explore

  • 07

    Initializing and Exporting with JSON

    Initializing and Exporting with JSON model

    This model demonstrates a basic use case of the JSON features built into Pypeline. These include the ability to serialize model objects to JSON (either to transfer data to Python or for logging purposes) and the ability to deserialize JSON text to agents and populations.

    The model also shows how serialization can be customized through a built-in filter object. The JSON features come from an AnyLogic JSONifier library built into Pypeline.

    Explore

  • 08

    Machine Optimizer

    Machine Optimizer model

    This model demonstrates static optimization using the Python libraries NumPy and SciPy. The model includes three types of machines; each has its own production cost and is built in three stages, with each stage having a different time requirement.

    There are two constraints: the allotted number of machines built and the cumulative time allotted to each stage. The goal is to choose how many machines of each type to build so that the total production cost is minimized.
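
    The model itself solves this with NumPy and SciPy, but the shape of the problem can be sketched with a pure-Python exhaustive search. All numbers below are invented for illustration, including a hypothetical minimum demand that makes the minimization non-trivial:

```python
from itertools import product

# Hypothetical per-machine costs and per-stage build times (hours).
cost = [4.0, 6.0, 9.0]          # production cost per machine of each type
stage_time = [[2, 1, 3],        # hours each type needs in stage 1
              [1, 2, 2],        # ... in stage 2
              [3, 1, 1]]        # ... in stage 3
max_machines = 10               # cap on machines built
stage_budget = [20, 20, 20]     # time allotted to each stage
required = 6                    # hypothetical minimum number of machines

best = None
for counts in product(range(max_machines + 1), repeat=3):
    if sum(counts) < required or sum(counts) > max_machines:
        continue
    # Reject mixes that exceed any stage's time budget.
    if any(sum(t * n for t, n in zip(row, counts)) > stage_budget[s]
           for s, row in enumerate(stage_time)):
        continue
    total_cost = sum(c * n for c, n in zip(cost, counts))
    if best is None or total_cost < best[0]:
        best = (total_cost, counts)

print(best)  # (minimum cost, machine counts per type)
```

    A real solver replaces this enumeration with linear programming, which is what makes SciPy attractive for the full model.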

    Explore

  • 09

    Python From Experiment Screen

    Python From Experiment Screen model

    This is a proof-of-concept model showing how to use Pypeline’s PyCommunicator object from within an experiment. Although the PyCommunicator is technically an agent, by initializing it from certain experiment action fields, Python can be called – in this example – to generate the next values in a parameter variation experiment.

    Explore

Simulation for training and testing AI – Email Pack

AnyLogic simulation is the training and testing platform for AI in business. With AnyLogic general-purpose simulation, you can construct detailed and robust virtual environments for training and testing your AI models. Its unique multimethod simulation capabilities make it a comprehensive tool for machine learning work. In use at leading companies across industries, this fully cloud-enabled platform with an open API is enhancing and accelerating AI development today. Find out more about this powerful machine learning tool in our AI email pack and white paper!

AI pack and white paper