The system scheduler (smoBackOffice)


smoBackOffice is a system scheduling tool used for running the processes referenced in a configuration file, such as Discovery or Network data integration, or preimport tasks.

Operating principle

  • Each type of processing is enabled or disabled depending on customer requirements.
  • The scheduling tool launches each process referenced in the configuration file, in the listed order and at its scheduled execution time.
  • Note: Priority is given to ongoing tasks, which are not interrupted; new tasks are queued.

Notes

  • smoBackOffice is shipped as an executable in the <Easyvista>\tools\ folder.
         <Easyvista>: the EasyVista folder on the application server

   See How to choose the best tool for automating integration.

Caution

 

smoBackOffice.cfg configuration file

[HTTP]
PORT=8091

;  ==>  For Network integration:
;       * indicate the configuration settings
[NETWORK]
;       * listening port (default=1234)
PORT=31163

;  ==>  For Discovery integration:
;       * indicate if Discovery data can be collected: ACTIVE parameter (Yes=1/No=0)
;       * indicate how data is collected: locally or via an FTP/SMTP protocol (active=1/inactive=0)
;       * specify the corresponding section [… Connect]
[GetData]
ACTIVE=0
FTP=0
SMTP=1
LOCAL=1
;       * DELAY = n minutes
DELAY=1
;       * Start and end integration times
Begin_Time=00:00:00:000
End_Time=23:59:59:999
LOG_DEBUG=0

;  ==>  For Discovery integration:
;       * indicate the configuration settings of the protocol if it is active in the [GetData] section
[FTP Connect]
rem Host=
rem Port=22 for SFTP
rem Port=21
rem UserID=
rem Password=
rem Directory=
rem Ssl=SFTP
HOST=127.0.0.1
USERID=easyvista
DIRECTORY=data
PORT=21

;  ==>  For Discovery integration:
;       * indicate the configuration settings of the protocol if it is active in the [GetData] section
[POP3 Connect]
rem Host=
rem Port=110
rem UserID=
rem Password=
Host=127.0.0.1
Port=110
UserID=easyvista@aef.com
Password=staff

;  ==>  For Discovery integration:
;       * indicate the EasyVista work folders
[Working folders]
DownloadDir=c:\easyvista\cust\done
UnZipDir=c:\easyvista\cust\
CSVDIR=\\127.0.0.1\report\
PDFDIR=\\127.0.0.1\report\
LocalResultDir=c:\Easyvista\Inventory

[GLOBAL]
MODE=SQL
;DELAY = n minutes
DELAY=1
;   * Authorize the encryption of passwords and connection strings in the .ini preimport files
CryptedPasswords=true

;  ==>  For smoBackOffice:
;       * indicate if integration can be run: ACTIVE parameter (Yes=1/No=0)
;       * then specify the configuration settings if active
[IMPORT]
ACTIVE=0
MAX_PACKET_SIZE=20
MAX_THREAD=10
;       * Start and end times for running the processing
Begin_Time=00:00:00:000
End_Time=23:59:59:999
;       * DELAY = n minutes
DELAY=1
LOG_DEBUG=0
;       * For SQL SERVER
BULK_EXE=C:\Pro
;       * For ORACLE
;BULK_EXE=C:\Program Files\oracle\80\Tools\Binn\SQLLDR.exe

;  ==>  For smoBackOffice:
;       * indicate if the Discovery module is installed: ACTIVE parameter (Yes=1/No=0)
;       * then specify the configuration settings if active
[DISCOVERY]
ACTIVE=1
;       * Start and end times for running the integration Discovery
Begin_Time=00:00:00:000
End_Time=23:59:59:999
;TEMPORARY_TABLE_NAME=
;ACCOUNT=23865
LOG_DEBUG=0
DO_NOT_DELETE_TEMPTABLE=0
PDF_ACTIVE=0
PDF_TEMPLATE=A-@_LANGUAGE_@-Inventory.rpt
PDF_EXPORT=inventory.pdf
CSV_ACTIVE=0
CSV_TEMPLATE=inventory.sql

;  ==>  For smoBackOffice:
;       * indicate if statistics processing can be run: ACTIVE parameter (Yes=1/No=0)
;       * then specify the configuration settings if active
[STAT]
ACTIVE=0
;       * DELAY = n minutes
DELAY=30
;       * Start and end times for running statistics processing
Begin_Time=00:00:00:000
End_Time=23:59:59:999
LOG_DEBUG=0

;  ==>  For smoBackOffice:
;       * indicate if Network processing can be run: ACTIVE parameter (Yes=1/No=0)
;       * then specify the configuration settings if active
[EZVNET]
ACTIVE=0
;DELAY = n days
DELAY=1
LOG_DEBUG=0
;       * Start and end times for running the integration Network
Begin_Time=00:00:00:000
End_Time=23:59:59:999
;MODE = FTP or POP3 or LOCAL
MODE=FTP
;POP3_Host=
;POP3_UserId=
;POP3_Password=
;POP3_Port=110
;EZVNetDir=CONFIGURE THIS VALUE WHEN MODE=LOCAL

;  ==>  For smoBackOffice and smoBackOfficeClient:
;       * indicate the database access parameters
[SQL Connection]
Server=EZV_SQL1
UserId=EZV_ADMIN
Password=password
BO_OWNER=EZV_ADMIN
BO_DatabaseName=EVO_BACKOFFICE
ADMIN_OWNER=EZV_ADMIN
ADMIN_DatabaseName=EVO_ADMIN
REF_OWNER= EZV_ADMIN
DateFormat=MM/DD/YYYY

[Administrator]
;Email=ReturnAddr
;SMTPHost=ReturnAddr
;SMTPUserId=ReturnAddr
;ReturnAddr=ReturnAddr
;Port=25

;  ==>  For smoBackOffice and smoBackOfficeClient:
;       * indicate if preimport processing can be run: ACTIVE parameter (Yes=1/No=0)
;       * then specify the configuration settings if active
[PREIMPORT]
ACTIVE=1
SAAS=1
;       * in seconds
DELAY=300
LOG_DEBUG=0
;       * Start and end times for running the preimport processing
Begin_Time=00:00:00:000
End_Time=23:59:59:999

[LOG]
LEVEL=0

;  ==>  For smoBackOffice:
;       * indicate if purge processing can be run on Discovery files: ACTIVE parameter (Yes=1/No=0)
;       * then specify the configuration settings if active
[PURGE]
ACTIVE=0
;in days
DELAY=1
LOG_DEBUG=0
;       * Start and end times for running purge processing
Begin_Time=00:00:00:000
End_Time=23:59:59:999
;       * in days
;DELETE_DISCOVERY_HISTORY=150
;DELETE_DISCOVERY_IN=100
;DELETE_CITRIX_HISTORY=150
;DELETE_CITRIX_IN=100

;  ==>  For smoBackOffice:
;       * indicate if Citrix-related processing can be run on Discovery integration data: ACTIVE parameter (Yes=1/No=0)
;       * then specify the configuration settings if active
[CITRIX]
ACTIVE=0
;        Start and end times for running Citrix-related processing
Begin_Time=00:00:00:000
End_Time=23:59:59:999
LOG_DEBUG=0
DO_NOT_DELETE_TEMPTABLE=0

[INSTALL]
VERSION=1.0

 

Preimport processing

The preimport task is a preliminary process run before third-party data is integrated in the Service Manager database. It is used to automate processing that cannot be performed using an integration model.

Operating principle
         (Diagram: Preimport process)

1. Third-party data sources (e.g. CSV file, database or directory) are read using the connection settings defined in .ini configuration files.

2. Data are stored in Service Manager work tables.

3. The processing referenced and scheduled in the .ini configuration files is run.

4. All of the processed data is stored in the final Service Manager tables. You can filter the tables to retrieve only data that is new or modified since the previous processing.

Examples

  • Preimport processing to run SQL scripts before running the model, e.g. data formatting, deletion, normalization, delta differencing.
  • Preimport task to synchronize third-party data with EasyVista, e.g. incidents.

Notes

  • A preimport task can read and process different third-party data sources.
  • Each preimport task can be run regularly based on the frequency defined in .ini configuration files.

   See How to choose the best tool for automating integration.

Caution

  • The scripts for creating or deleting Service Manager objects must always be run in the SQL Server database before running the preimport task.

Best practice

  • How to ensure optimal performance of the Service Manager platform:
    • Whenever possible, schedule preimport tasks outside platform production periods, i.e. at night, at weekends, or when 24/7 platforms are used by the fewest users.
    • Optimize data processing time and load by performing incremental integration instead of full integration.
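The incremental approach recommended above can be sketched in T-SQL. This is a minimal illustration only: the work and final tables (e_csv_temp2, e_csv_ok), the login key, and the email and date_update columns are hypothetical names, to be replaced with your own schema.

```sql
-- Hedged sketch of incremental integration: table and column names are assumptions.
-- 1. Update rows that already exist in the final table but have changed since the last run
UPDATE f
SET    f.[email] = t.[email],
       f.[date_update] = Getutcdate()
FROM   [EVO_BACKOFFICE].[EZV_ADMIN].[e_csv_ok] f
JOIN   [EVO_BACKOFFICE].[EZV_ADMIN].[e_csv_temp2] t ON t.[login] = f.[login]
WHERE  f.[email] <> t.[email];

-- 2. Insert only the rows that are new since the previous run
INSERT INTO [EVO_BACKOFFICE].[EZV_ADMIN].[e_csv_ok] ([login], [email], [date_update])
SELECT t.[login], t.[email], Getutcdate()
FROM   [EVO_BACKOFFICE].[EZV_ADMIN].[e_csv_temp2] t
WHERE  NOT EXISTS (SELECT 1
                   FROM [EVO_BACKOFFICE].[EZV_ADMIN].[e_csv_ok] f
                   WHERE f.[login] = t.[login]);
```

Compared with a full TRUNCATE-and-reload, only the delta is written, which reduces processing time and lock contention on the final tables.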

Description of preimport files

PREIMPORT_SQL.sql

  • This is the general scheduler for the preimport process.
  • It indicates the paths of all .ini configuration files used by the process and determines the processing sequence.
  • It must be placed in the /Preimport/ folder in smoBackOffice.
[PREIMPORT]
import_csv.ini=
import_ldap.ini=

 

.ini configuration files

  • They indicate:
    • The third-party data sources used and the connection settings to each of them.
    • The different processes to be run and their sequence.
  • They must be placed in the /Preimport/ folder in smoBackOffice.
  • SQL scripts must be placed in a subfolder in the /Preimport/ folder, e.g. /Preimport/CUSTOMERNAME.
[DEFINITION]
;  ==>  General elements in the .ini file, such as run time and frequency
;       * indicate if the file can be run: ACTIVE parameter (Yes=1/No=0)
;
NAME=IMPORT EXAMPLE
ACTIVE=1
;in days
FREQUENCY=1
BEGINTIME=06:00:00
LOG_DEBUG=1
DO_NOT_DELETE_TEMPTABLE=0
;last execution date

[PREPROCESS]
;  ==>  List of SQL scripts to be run before reading or loading data
;
;script_preprocess.sql=
FCT=RESULTAT=WSDL(EVO_BACKOFFICE,50004,TEST_WSDL,ID)

[CLEAN]
;scriptclean.sql=

[PROCESS]
;  ==>  List of SQL scripts to be run for loading raw data
;
;IMPORTLDAP1=

[POSTPROCESS]
;  ==>  List of SQL scripts to be run after loading data
;
;script_postprocess.sql=

[IMPORTLDAP1]
;  ==>  Connection settings to one of the data sources declared in the .ini configuration file
;       * name the section based on the type of data source: [IMPORTLDAP], [IMPORTCSV], [IMPORTORACLE], [IMPORTADO], ...
;
DBKERNEL=LDAP
HOSTNAME_1=MyLDAPServerIP
LOGIN_1=MyLogin
PASSWORD_1=MyPassword
DOMAIN_1=DC=staffandline,DC=local      
OBJECT_1=(|(&(CN=MYPC)(objectClass=*))(&(CN=MY2PC)(objectClass=*)))
FIELDS_1=DN,ObjectGuid
TABLE_1=MyTable
UNICODE_1=0
DNUNICODE_1=0
SINGLEPASS_1=1
DO_NOT_DELETE_TEMPTABLE=1
protocol_1=3
;ROW_SEPARATOR_1=ONELARGEROWSEPARATOR
;LINE_SEPARATOR_1=ONELARGELINESEPARATOR

[IMPORTLDAP2]
DBKERNEL=LDAP2
HOSTNAME_1=1.2.3.4
PORT_1=389
LOGIN_1=LDAPAccount
PASSWORD_1=********
DOMAIN_1=DC=CUSTOMERNAME,DC=com
OBJECT_1=(&(objectcategory=person)(objectClass=user))
FIELDS_1=cn,displayName,mail,sAMAccountName,c,l,title,telephoneNumber,company,department,mobile,manager,locale,language,countryCode,DN
TABLE_1=E_LDAP_CUSTOMERNAME_TEMP
DO_NOT_DELETE_TEMPTABLE=1
ORGANISATION_FILTER_1=(OU=*)
PROTOCOL_1=3
UNICODE_1=1
DNUNICODE_1=0
SinglePass_1=1

[IMPORTADO1]
DBKERNEL=ADO
CONNECTIONSTRING_1=Provider=sqloledb;Data Source=0627-sf\OFFICIEL;Initial Catalog=EVO_ADMIN;User Id=EZV_ADMIN;Password=staff;
TABLE_1=MyTable
REQUEST_1=MyRequest.sql
DO_NOT_DELETE_TEMPTABLE=1
ROW_SEPARATOR_1=ONELARGEROWSEPARATOR
LINE_SEPARATOR_1=ONELARGELINESEPARATOR

[IMPORTORACLE1]
DBKERNEL=ORACLE
HOST=OBELIX
LOGIN=
PASSWORD=
DATABASE=
TABLE_1=
REQUEST_1=MyRequest.sql
TABLE_2=
REQUEST_2=MyRequest.sql
DO_NOT_DELETE_TEMPTABLE=1

[IMPORTCSV1]
DBKERNEL=CSV
TABLE_1=TABLE_CSV_TEMP
DIRECTORY_1=C:\easyvista\tools\smoBackOffice\QUERIES\SQL\PREIMPORT\
EXTENTION_1=MyFile.csv
SEPARATOR_1=;
DO_NOT_DELETE_TEMP_TABLE=1

Section [PREPROCESS]

This section lists the SQL scripts to be run before reading or loading data. These are usually queries for purging tables.

    Example

  • Purge table MyTable
  • Run the scripts found in CUSTOMERNAME. Note: Remember to add the = symbol that marks the end of an instruction.
DELETE FROM mytable
CUSTOMERNAME\LDAP_TRUNCATE_TABLE_LDAP_CUSTOMERNAME_TEMP.sql=
CUSTOMERNAME\LDAP_TRUNCATE_TABLE_LDAP_CUSTOMERNAME_OK.sql=

Section [PROCESS]

This section lists the SQL scripts to be run for loading raw data into Service Manager work tables. They depend on the third-party data source and can be used to populate new Service Manager work tables.

    Example

  • Run a script in the CUSTOMERNAME folder. Note: Remember to add the = symbol that marks the end of an instruction.
CUSTOMERNAME\LDAP_INSERT_TABLE_LDAP_CUSTOMERNAME_OK.sql=

Section [POSTPROCESS]

This section lists the SQL scripts to be run on data loaded in Service Manager work tables.

  • Standard Service Manager scripts for standardizing data.
        Example
    • Convert all dates to the Service Manager format
    • Then store them in a new work table
INSERT INTO [EVO_BACKOFFICE].ezv_admin.e_csv_temp2
            ([qualite],[lastname],[firstname],[directory],[start_date],[end_date])
SELECT Isnull(Rtrim(Upper([civility])), ''),
       Isnull(Upper([lastname]), ''),
       Isnull(Lower([firstname]), ''),
       Isnull([directory], ''),
       Cast(( Substring(start_date, 7, 4) + '-' + Substring(start_date, 4, 2) + '-' + Substring(start_date, 1, 2) + ' 00:00:00.000' ) AS DATETIME),
       Cast(( Substring(end_date, 7, 4) + '-' + Substring(end_date, 4, 2) + '-' + Substring(end_date, 1, 2) + ' 00:00:00.000' ) AS DATETIME) AS end_date
FROM   [EVO_BACKOFFICE].ezv_admin.[e_csv_temp]
WHERE  firstname NOT LIKE 'TEST%'
       AND directory NOT LIKE 'RET%'
       AND sitecode NOT LIKE '%--%'
  • Customized scripts, specific to each customer, for normalizing data
        Example
    • Store values in MB instead of GB
    • Retrieve the location using three customer data fields
    • Number the graphics card for each workstation
       
    • Normalize titles in the work table
UPDATE [EVO_BACKOFFICE].ezv_admin.e_csv_temp
SET    civilite = CASE qualite
                   WHEN 'Sir' THEN 'Mr.'
                   WHEN 'sir' THEN 'Mr.'
                   WHEN 'Madam' THEN 'Ms.'
                   WHEN 'Miss' THEN 'Ms.'
                   WHEN 'Mrs' THEN 'Ms.'
                   ELSE qualite
                 END
FROM   [EVO_BACKOFFICE].ezv_admin.e_csv_temp
  • Data normalization scripts that can be customized by each customer, for integration gateways.
        Example
    • Monitor all SCCM names for the versions of a given software grouped under a single catalog entry
  • Scripts for populating the final Service Manager tables. These are run after the prior processing. Note: You can insert a WHERE clause in the script if certain minor processing is required.
        Example
    • Integrate data from a CSV file without the header row
      WHERE [col0] <> 'nom'
    • Populate a final Service Manager table using data from a work table
INSERT INTO [EVO_BACKOFFICE].[EZV_ADMIN].[e_csv_ok]
            ([civility],[name],[start_date],[end_date],[email],[phone],[cellphone],[rsoc_div_dep],[domain],[language],[date_update],[login])
SELECT [civility],[name],[start_date],[end_date],[email],[phonefixext],[cellphone],[rsoc_div_dep],[domain],[language],Getutcdate(),login
FROM   [EVO_BACKOFFICE].[EZV_ADMIN].[e_csv_temp2]

 

The tool for running preimports (smoBackOfficeClient)

smoBackOfficeClient is a tool for running preimport tasks on demand and autonomously, without using the smoBackOffice system scheduler.

It requires command lines indicating the configuration files to be run. These can be added to a Batch or PowerShell script and run at a scheduled time using an external scheduler or third-party application.

Example

Nexthink integration enables you to integrate 48 health metrics related to equipment in Service Manager. As a general rule, it is run every 30 minutes.

==>  To ensure this frequency, smoBackOfficeClient is used: the integration runs immediately instead of being placed in the smoBackOffice system scheduler queue.

Notes

  • smoBackOfficeClient is shipped as an executable in the <Easyvista>\tools\ folder.
         <Easyvista>: the EasyVista folder on the application server

   See How to choose the best tool for automating integration.

Caution

  • When you run a task using the smoBackOfficeClient tool, it must no longer be run using the smoBackOffice system scheduler.

Best practice

  • How to ensure optimal performance of the Service Manager platform:
    • Reduce the number of tasks run simultaneously by smoBackOfficeClient.
    • Whenever possible, schedule tasks outside platform production periods, i.e. at night, at weekends, or when 24/7 platforms are used by the fewest users.
  • You can define a preimport task to run immediately using smoBackOfficeClient followed by immediate data integration using smoIntegration. To do this, add the two command lines to a Batch or PowerShell script.
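The combined run described in the last bullet can be sketched as a Windows batch file. This is a minimal sketch only: the smoBackOfficeClient call follows the command-line syntax documented on this page, but the smoIntegration path and arguments are hypothetical placeholders, to be replaced with your actual integration command.

```batch
@echo off
rem Hedged sketch: run a preimport task, then trigger data integration immediately.
rem Paths and the smoIntegration arguments below are assumptions to adapt.

rem 1. Run the preimport task (syntax as documented on this page)
C:\easyvista\tools\smoBackOffice\smoBackOfficeClient.exe C:\easyvista\tools\smoBackOffice\QUERIES\SQL\PREIMPORT\NEXTHINK_ASSET_STATUS.ini
if errorlevel 1 exit /b 1

rem 2. Run the integration right after the preimport completes (hypothetical arguments)
C:\easyvista\tools\smoIntegration\smoIntegration.exe <your-integration-parameters>
```

The errorlevel check stops the script if the preimport fails, so the integration never runs against incomplete work tables. The same script can then be scheduled with an external scheduler such as Windows Task Scheduler.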

 

How to use the smoBackOfficeClient tool

Step 1: Installing the smoBackOfficeClient tool

1. Go to the <Easyvista>\tools\smoBackOffice folder.
          <Easyvista>: The EasyVista folder on the application server

2. Duplicate the file called smoBackOffice.cfg.

3. Rename the file smoBackOfficeClient.cfg.
 

Step 2: Running the preimport task you want

1. Open a command prompt window.

2. Enter the following command line. 

  • Replace <subfolder> with the subfolder where the .ini configuration file for the preimport task is located.
  • Replace <file.ini> with the name of the .ini configuration file for the preimport task.
smoBackOfficeClient.exe <Easyvista>\tools\<subfolder>\<file.ini>

    Example: To run Nexthink integration

smoBackOfficeClient.exe C:\easyvista\tools\smoBackOffice\QUERIES\SQL\PREIMPORT\NEXTHINK_ASSET_STATUS.ini
Last modified by Unknown User on 2019/04/03 11:44
Created by Administrator XWiki on 2015/04/10 11:30
Powered by XWiki ©, EasyVista 2019