
BI-data Interface

Introduction

This is a standard export function that can be used with any BI system (Microsoft Power BI, Tableau, Qlik, etc.) for data visualization, to identify factory trends and spot business problems.

One can use this integration to regularly push data from RS Production to a remote destination by setting up a scheduled export with the destination details specified.

Destinations

This function offers several different types of destinations:

| Destination | File and format | RS Production Edge Service required |
| --- | --- | --- |
| Azure Blob Storage | CSV/JSON | No |
| Azure SQL database | Database file | No |
| FTP Server | CSV/JSON | No (but FTP server software is required) |
| Local file folder | CSV/JSON | Yes |
| Local SQL Server database | Database file | Yes |

Azure Blob Storage

A good choice if you want to use data from RS Production with something you are developing on Azure.

To use this, you need a blob storage container in your Azure account and a SAS URL for the blob storage resource.

  1. Create a storage account in your Azure portal.

  2. Create a container in the blob storage. The container name should be the same as the six-digit RS Production installation number.

  3. Add an access policy on the container, specifying the start and expiry time during which it can be accessed.

  4. Generate a SAS URL by selecting the newly created access policy, save the blob SAS URL, and add it to the RS Production settings.
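If you prefer scripting over the portal, the same preparation can be done with the Azure SDK. The sketch below (Python, azure-storage-blob) creates the container and then generates an ad-hoc SAS token directly from the account key rather than from a stored access policy; the account name, key, and expiry are placeholders and should be adjusted to your environment.

```python
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    BlobServiceClient,
    ContainerSasPermissions,
    generate_container_sas,
)

account_name = "mystorageaccount"        # placeholder
account_key = "<storage-account-key>"    # placeholder
container_name = "123456"                # your six-digit installation number

# Step 2: create the container (skip if it already exists)
service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.windows.net",
    credential=account_key,
)
service.create_container(container_name)

# Steps 3-4, simplified: generate an ad-hoc SAS token valid for one year
# instead of one tied to a stored access policy
sas_token = generate_container_sas(
    account_name=account_name,
    container_name=container_name,
    account_key=account_key,
    permission=ContainerSasPermissions(read=True, write=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=365),
)

sas_url = f"https://{account_name}.blob.core.windows.net/{container_name}?{sas_token}"
print(sas_url)  # paste this URL into the RS Production BI export settings
```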

Azure SQL database

A good choice if you want to use data from RS Production with something you are developing on Azure.

FTP Server

A good choice if you are developing something locally.

Local file folder

A good choice if you are developing something locally. This option requires a local RS Production Edge Service.

Read more about the Edge Service and how it communicates securely with the cloud.

Read more about how to install the Edge Service.

Local SQL Server database

A good choice if you are developing something locally. This option requires a local RS Production Edge Service.

Read more about the Edge Service and how it communicates securely with the cloud.

Read more about how to install the Edge Service.

Settings

This feature can be found and configured in the Settings menu under the Integration section.

When activating the BI export, the following settings should be filled in:

Time interval

Select the time frame from the dropdown list; the export will regularly include data for the selected period.

  • Last 30 days

  • Last 7 days

  • From yesterday

  • Today

Export scheduling

Set how often the scheduled export should run: on which days and at what interval.

Destination

Choose the external location from the available list and fill in the details needed for the export.

Azure blob storage

  • File type - CSV/JSON

  • SAS URL - A link to the container in the Azure storage account where the files will be uploaded.

    • SAS stands for Shared Access Signature, which provides secure access to the resources in a storage account. With a SAS, you can control how a third party can access the data and set the permissions and validity period.

    • The customer creates a blob storage container in their Azure subscription and then creates the SAS URL for the blob storage resource.

      Steps to create the container:

      • Create a storage account in the Azure portal.

      • Create a container in the blob storage. The container name should be the same as the installation number.

      • Add an access policy on the container, specifying the start and expiry time during which it can be accessed.

      • Generate the SAS URL by selecting the newly created access policy, save the blob SAS URL, and add it to the RS Production settings.
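To verify the SAS URL and inspect what RS Production has exported, a consumer-side sketch like the following can be used (Python, azure-storage-blob). The file name is only an example; the actual names depend on the chosen file type and export.

```python
from azure.storage.blob import ContainerClient

sas_url = "<blob container SAS URL>"  # the same URL entered in the BI export settings
container = ContainerClient.from_container_url(sas_url)

# List what has been exported so far (requires list permission in the SAS)
for blob in container.list_blobs():
    print(blob.name, blob.size)

# Download one exported file (file name is hypothetical)
with open("worktime.csv", "wb") as f:
    f.write(container.download_blob("worktime.csv").readall())
```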

Azure SQL database

  • Server name

  • Username

  • Password

  • Database name

FTP Server - Integration service is required

  • File type - CSV/JSON

  • Server name - FTP host

  • Username

  • Password

  • Directory - The path where you want to receive the files
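On the consumer side, the exported files can be fetched from the FTP destination with any FTP client. The sketch below uses Python's standard ftplib; the host, credentials, directory, and file name are placeholders.

```python
from ftplib import FTP

with FTP("ftp.example.com") as ftp:        # Server name from the settings
    ftp.login(user="rsproduction", passwd="secret")
    ftp.cwd("/exports/rs-production")      # Directory from the settings

    print(ftp.nlst())                      # list the exported files

    # Download one exported file (file name is hypothetical)
    with open("worktime.csv", "wb") as f:
        ftp.retrbinary("RETR worktime.csv", f.write)
```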

Local folder - works only on local installations

  • File type - CSV/JSON

  • Directory - The path where you want to receive the files (currently works only on the machine where the installation is set up)

SQL Server database - Integration service is required

  • Server name

  • Username

  • Password

  • Database name
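On the consumer side, the exported tables can be queried like any other SQL Server database. The sketch below uses pyodbc and applies to both the Azure SQL database and SQL Server database destinations; the server, credentials, database, and table names are placeholders, so check the delivered schema (see the example SQL file at the end of this page) for the actual table names.

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sql.example.local;"        # Server name from the settings
    "DATABASE=RSProductionBI;"         # Database name from the settings
    "UID=rsproduction;PWD=secret;"
    "TrustServerCertificate=yes;"
)

cursor = conn.cursor()
# Table name is hypothetical; adjust it to the delivered schema
for row in cursor.execute("SELECT TOP 10 * FROM Worktime"):
    print(row)
```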

This is included in the data export

RS Production exports the following data:

  1. Setting - contains the information that determines how OEE should be calculated

  2. Stop - contains information about the stops that have occurred during each shift of the measure point

  3. Worktime - contains the order/article information for what has run during each shift of the measure point. The runs are stored on an hourly basis.

  4. StopOccasion
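As an illustration, a CSV export can be loaded into pandas for analysis as in the sketch below. The file names are hypothetical and depend on the chosen destination and file type; the column names match the field lists that follow.

```python
import pandas as pd

# Load two of the exported data sets (file names are examples only)
worktime = pd.read_csv("worktime.csv", parse_dates=["IntervalStart", "IntervalEnd"])
stops = pd.read_csv("stop.csv", parse_dates=["IntervalStart", "IntervalEnd"])

# Total stop time per measure point, as a starting point for trend analysis
print(stops.groupby("MeasurePoint")["TotalStopDuration"].sum())
```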

1. Setting

| Fields | Data type | Comment |
| --- | --- | --- |
| InstallationID |  | Installation GUID |
| Installation |  | Name of the installation |
| CalculatePerformanceFromWeightedProducedUnits |  | Indicates whether the produced amount on all measure points is calculated equally, regardless of the speed they are defined to run at |
| MicrostopAsPerformanceLoss |  | Indicates whether the loss for microstops is moved from availability loss to speed loss |
| ReworkAsQualityLoss |  | Indicates whether the loss for rework is moved from speed loss to quality loss |

2. Stop

| Fields | Comment |
| --- | --- |
| InstallationID | Installation GUID |
| Installation | Name of the installation |
| MeasurePointID | Measure point GUID |
| MeasurePoint | Name of the measure point/machine |
| IntervalStart | Start time of a stop |
| IntervalEnd | End time of a stop |
| TotalStopDuration | Total stop time duration |
| StopReason | If a stop is a microstop, it is returned as ‘Micro stops’. If a stop occurs outside worktime, it is returned as ‘No active order’. If there is no schedule on a machine, it is returned as ‘No schedule’. If a stop is not categorized, it is returned as ‘Uncategorized’. If a stop is categorized, the stop reason code is returned. |
| Categories | Interrupt reason categories |
| Station | Name of the station on which the stop occurred |
| Comment |  |
| ShiftID | GUID of the shift instance during which the stop occurred. If a stop has a longer duration, ShiftID is returned as an array of shift instance GUIDs separated by commas (,). |

3. Worktime

| Fields | Comment |
| --- | --- |
| InstallationID | Installation GUID |
| Installation | Name of the installation |
| MeasurePointID | Measure point GUID |
| MeasurePoint | Name of the measure point/machine |
| IntervalStart | Start time of the worktime |
| IntervalEnd | End time of the worktime |
| ShiftID | GUID of the shift instance |
| Shift | Name of the shift |
| ProductionOrder | Order number |
| Article | Article number |
| ArticleName | Article name |
| ArticleType | Article type |
| TotalDuration | The total duration (in seconds) |
| ScheduledDuration | Time scheduled on the measure point |
| ExcludedDuration | Time during which stops were marked as excluded |
| StopDuration | The total time the measure point was stopped |
| SetupStopDuration | The total time of changeovers |
| NoWorkTimeStopDuration |  |
| MicroStopDuration | The total time of all microstops combined |
| ProductionTimeDuration | The total time the measure point was producing |
| UsedEffectiveTime |  |
| ReworkedEffectiveTime |  |
| ScrappedEffectiveTime |  |
| OptimalProducedUnitsNoMicroStop |  |
| OptimalProducedUnits | Amount of optimally produced units during the time the machine was available |
| ProducedUnits | Amount of produced units |
| ApprovedUnits | Amount of approved units |
| ScrappedUnits | Amount of scrapped units |
| ReworkedUnits | Amount of reworked units |

4. Stop Occasion

| Fields | Comment |
| --- | --- |
| InstallationID | Installation GUID |
| Installation | Name of the installation |
| MeasurePointID | Measure point GUID |
| MeasurePoint | Name of the measure point/machine |
| IntervalStart | Start time of a stop |
| IntervalEnd | End time of a stop |
| Duration |  |
| TotalStopDuration | Total stop time duration |
| ScheduledDuration |  |
| ProductiveDuration |  |
| StopQuantity |  |
| NumberOfProductiveOccasions |  |
| MTBF |  |
| MTTR |  |

OEE calculation

Availability

((ScheduledDuration - StopDuration - SetupStopDuration - NoWorkTimeStopDuration + MicroStopDuration) / (ScheduledDuration_data - ExcludedDuration_data)) * 100

Performance

  • Performance has four different calculations, depending on the settings described in the Setting table above.

  • If MicrostopAsPerformanceLoss & CalculatePerformanceFromWeightedProducedUnits settings are on:

    • ProductionTimeDuration_data + MicroStopDuration_data > 0 ? ((UsedEffectiveTime_data - (ReworkAsQualityLoss ? 0 : ReworkedEffectiveTime_data)) / (ProductionTimeDuration_data + MicroStopDuration_data)) * 100 : 100

  • If only MicrostopAsPerformanceLoss setting is on:

    • OptimalProducedUnitsNoMicroStop_data > 0 ? (OptimalProducedUnitsNoMicroStop_data > ProducedUnits_Scheduled_data ? ((ProducedUnits_Scheduled_data - (ReworkAsQualityLoss ? 0 : ReworkedUnits_Scheduled_data)) / OptimalProducedUnitsNoMicroStop_data) * 100 : 100) : 100

  • If only the CalculatePerformanceFromWeightedProducedUnits setting is on:

    • ProductionTimeDuration_data > 0 ? ((UsedEffectiveTime_data - (ReworkAsQualityLoss ? 0 : ReworkedEffectiveTime_data)) / ProductionTimeDuration_data) * 100 : 100

  • If none of the settings are on:

    •  OptimalProducedUnits_data > 0 ? (OptimalProducedUnits_data > ProducedUnits_Scheduled_data ? ((ProducedUnits_Scheduled_data - (ReworkAsQualityLoss ? 0 : ReworkedUnits_Scheduled_data)) / OptimalProducedUnits_data) * 100 : 100) : 100

Quality

  • Quality has two different calculations, depending on the settings described in the Setting table above.

  • If the setting CalculatePerformanceFromWeightedProducedUnits is on:

    • UsedEffectiveTime_data - ScrappedEffectiveTime_data - (ReworkAsQualityLoss ? ReworkedEffectiveTime_data : 0) >= 0 ? (UsedEffectiveTime_data > 0 ? ((UsedEffectiveTime_data - ScrappedEffectiveTime_data - (ReworkAsQualityLoss ? ReworkedEffectiveTime_data : 0)) / UsedEffectiveTime_data) * 100 : 100) : 0

  • If no setting is on:

    • ProducedUnits_Scheduled_data - ScrappedUnits_Scheduled_data - (ReworkAsQualityLoss ? ReworkedUnits_Scheduled_data : 0) >= 0 ? (ProducedUnits_Scheduled_data > 0 ? ((ProducedUnits_Scheduled_data - ScrappedUnits_Scheduled_data - (ReworkAsQualityLoss ? ReworkedUnits_Scheduled_data : 0)) / ProducedUnits_Scheduled_data) * 100 : 100) : 0
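For orientation, here is a minimal Python sketch of the three calculations above. It is not the official RS Production implementation: variable names mirror the expressions in this section with the `_data` suffix dropped, the boolean parameters correspond to the flags in the Setting export, the division-by-zero guard in the availability function is an added assumption, and how the `_Scheduled` values map onto the exported Worktime columns is assumed rather than specified here.

```python
# row is expected to contain all the exported fields used below.

def availability(row):
    denom = row["ScheduledDuration"] - row["ExcludedDuration"]
    if denom <= 0:
        return 100.0  # guard added here; the formula above does not cover this case
    available_time = (row["ScheduledDuration"] - row["StopDuration"]
                      - row["SetupStopDuration"] - row["NoWorkTimeStopDuration"]
                      + row["MicroStopDuration"])
    return available_time / denom * 100


def performance(row, microstop_as_performance_loss, weighted_units, rework_as_quality_loss):
    # Rework is only deducted from performance when it is NOT treated as a quality loss.
    rework_time = 0 if rework_as_quality_loss else row["ReworkedEffectiveTime"]
    rework_units = 0 if rework_as_quality_loss else row["ReworkedUnits_Scheduled"]

    if microstop_as_performance_loss and weighted_units:
        base = row["ProductionTimeDuration"] + row["MicroStopDuration"]
        return (row["UsedEffectiveTime"] - rework_time) / base * 100 if base > 0 else 100

    if microstop_as_performance_loss:
        optimal = row["OptimalProducedUnitsNoMicroStop"]
        if optimal <= 0 or optimal <= row["ProducedUnits_Scheduled"]:
            return 100
        return (row["ProducedUnits_Scheduled"] - rework_units) / optimal * 100

    if weighted_units:
        base = row["ProductionTimeDuration"]
        return (row["UsedEffectiveTime"] - rework_time) / base * 100 if base > 0 else 100

    optimal = row["OptimalProducedUnits"]
    if optimal <= 0 or optimal <= row["ProducedUnits_Scheduled"]:
        return 100
    return (row["ProducedUnits_Scheduled"] - rework_units) / optimal * 100


def quality(row, weighted_units, rework_as_quality_loss):
    # Rework is only deducted from quality when it IS treated as a quality loss.
    if weighted_units:
        rework = row["ReworkedEffectiveTime"] if rework_as_quality_loss else 0
        good = row["UsedEffectiveTime"] - row["ScrappedEffectiveTime"] - rework
        if good < 0:
            return 0
        return good / row["UsedEffectiveTime"] * 100 if row["UsedEffectiveTime"] > 0 else 100

    rework = row["ReworkedUnits_Scheduled"] if rework_as_quality_loss else 0
    good = row["ProducedUnits_Scheduled"] - row["ScrappedUnits_Scheduled"] - rework
    if good < 0:
        return 0
    return good / row["ProducedUnits_Scheduled"] * 100 if row["ProducedUnits_Scheduled"] > 0 else 100
```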

Example SQL file

BI export - SQL script.sql
