Algolytics Technologies Documentation

Implementation of scoring models


Last updated 4 months ago

Scoring Engine supports scoring models in four formats: Java, PMML, Python and R code. Added codes can be queried by the engine or used in scenarios.

Upload new model

Scoring code can be added in the Scoring code management panel. To add a scoring code, click the Scoring code actions button and select the Upload new model option, then enter the name of the code and drag the file with the code (or a .zip file in the case of R models) to the designated area on the right side of the screen, or click this area and select the file with the code to add.

A .zip file containing an R model must contain:

  • an .R file with a script containing transformations, calculations and/or models stored in RDS files. The variable holding the script output must be named rResult; RDS files can be used by referring to them by name

  • a CSV file with input variables (as described in section)

  • (optionally) RDS file(s) with model(s); these models can be used in the script by referring to them by file name. To save a model as an RDS file, use the saveRDS function in R.

A .zip file containing a Python model must contain:

  • a .pkl file with a serialized Python object

  • a CSV file with input variables (as described in section)

  • a .py file containing a scoring function with access to the input variables and the model object unpickled from model.pkl
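Putting those pieces together, a minimal .py scoring script might look like the sketch below. The entry-point name `score`, the dict-shaped input and output variables, and the stand-in model contents are assumptions for illustration only; the actual calling contract is defined by the engine, not by this page.

```python
import pickle

# Stand-in model.pkl created here so the sketch is self-contained;
# in a real .zip package the file is shipped alongside the script.
with open('model.pkl', 'wb') as f:
    pickle.dump({'weight': 2.0}, f)

# Unpickle the serialized model object from model.pkl.
with open('model.pkl', 'rb') as f:
    model = pickle.load(f)

def score(inputs):
    # 'inputs' stands for the input variables declared in the CSV file;
    # the result dict stands for the output variables (assumed shape).
    return {'prediction': inputs['x'] * model['weight']}
```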

A sample Python model calculating a grade based on the given percentage (.zip):

A sample file for creating the .pkl:

import pickle

# Grade -> minimum percentage required for that grade
data = {
    '2': 0,
    '3': 50,
    '3.5': 60,
    '4': 70,
    '4.5': 80,
    '5': 90
}

# Serialize the lookup table as model.pkl for the .zip package
with open('model.pkl', 'wb') as f:
    pickle.dump(data, f)
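As an illustration of how the pickled thresholds might be used by the scoring script, the sketch below recreates the lookup table and resolves a percentage to the highest grade whose minimum is met. The function name `score` is a hypothetical entry point, not a name mandated by the engine:

```python
import pickle

# Recreate and serialize the sample thresholds, as model.pkl would be.
thresholds = {'2': 0, '3': 50, '3.5': 60, '4': 70, '4.5': 80, '5': 90}
with open('model.pkl', 'wb') as f:
    pickle.dump(thresholds, f)

# In the scoring script, the object is unpickled from model.pkl.
with open('model.pkl', 'rb') as f:
    model = pickle.load(f)

def score(percentage):
    """Return the highest grade whose minimum percentage is met."""
    best = '2'
    for grade, minimum in model.items():
        if percentage >= minimum and minimum >= model[best]:
            best = grade
    return best

print(score(75))  # 75 >= 70 but < 80, so the grade is '4'
```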

A list of added codes is presented at the bottom of the panel.

Import from json

Scoring code can be imported in the Scoring code management panel. To import a code, click the Scoring code actions button and select the Import from json option, then drag the file with the code to the designated area on the right side of the screen, or click this area and select the file with the code to add.

Export

To download all scoring codes as a JSON file, click the Scoring code actions button and select the Export option.

Testing a code

Added codes can be queried in the Forms panel. To do so, select a model from the model drop-down list, enter proper values for the displayed input variables (defined in the code), and then click the Query button. If the query is correct, the output variables (defined in the code) will be displayed; otherwise, the engine will display an appropriate error. Queries can be viewed in the Browse scoring results panel after choosing Scoring model or Pmml, R or Python model in the Result type drop-down list.
