Preimports
Definition
A preimport is a preliminary processing run executed before external data is integrated into the Service Manager database. It is used to automate, by means of SQL scripts, the processing to be performed before the integration model is run. See the examples below.
- SQL scripts are listed in INI configuration files. Dedicated sections enable you to describe the processing to be performed before and after storing external data in Service Manager.
See the description of the [PREPROCESS], [PROCESS] and [POSTPROCESS] sections below.
- The processing is run by the Service Manager smoBackOffice scheduler, in the order in which it is listed in the PREIMPORT_SQL.sql file and according to the scheduling parameters defined in each INI file.
Examples
- Purge a table in Service Manager prior to external data integration.
- Normalize external data before integrating it in Service Manager, e.g. convert CPU capacity from MB to GB before integrating discovery data (see the sketch after this list).
- Standardize external data before integrating it in Service Manager, e.g. replace the values Hewlett P and Hewlett Packard with the value stored in the Service Manager database, i.e. HP.
- Filter external data to be integrated in Service Manager, e.g. only extract data from SCCM tables corresponding to catalog entries to be monitored in Service Manager.
See SCCM gateway.
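For instance, the CPU capacity conversion could be handled by a short normalization script such as the sketch below. The work table e_disco_temp and its cpu_capacity column are hypothetical names used for illustration only.

-- Sketch only: hypothetical work table and column
-- Values are loaded in MB; store them in GB before integration
UPDATE e_disco_temp
SET cpu_capacity = cpu_capacity / 1024.0
WHERE cpu_capacity IS NOT NULL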
Operating principle
(Diagram: operating principle of the preimport, showing each display and the corresponding steps.)
Notes
- A preimport can read and process different external data sources.
- You can run a preimport via an external scheduler or third-party application using the smoBackOfficeClient tool.
See How to automate an integration.
Caution
- The scripts for creating or deleting Service Manager objects must always be run in the SQL Server database before running the preimport.
- Remember to add the = character at the end of each SQL script entry in the configuration files.
example
CUSTOMERNAME\LDAP_INSERT_TABLE_LDAP_CUSTOMERNAME_OK.sql=
Best Practice
- Run normalization processing on the external data if the integration is intended to update reference tables and catalogs. This avoids polluting the existing repository with duplicate values corresponding to the same reference (see the sketch after this list).
example
- Replace the values Hewlett P and Hewlett Packard with the value stored in the Service Manager database, i.e. HP
- Replace the title, Mrs. with the value stored in the titles table, i.e. Ms.
- You should run scripts that populate tables last, once all of the other processing (normalization, etc.) has been run.
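A minimal sketch of such a normalization script, to be run before any script that populates tables. The work table e_disco_temp and its manufacturer column are hypothetical names used for illustration only.

-- Sketch only: hypothetical work table and column
-- Map duplicate manufacturer values to the value stored in the repository
UPDATE e_disco_temp
SET manufacturer = 'HP'
WHERE manufacturer IN ('Hewlett P', 'Hewlett Packard')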
Description of files used by the preimport
PREIMPORT_SQL.sql
- This is the general scheduler for the preimport.
- It indicates the paths of all configuration files used by the preimport and manages the processing sequence.
- Note: You must place the file in the folder called <easyvista>\tools\smoBackOffice\QUERIES\SQL\PREIMPORT\. <easyvista> identifies the EasyVista folder on your application server.
[PREIMPORT]
<PreimportFileName1.ini>=
<PreimportFileName2.ini>=
<PreimportFileName3.ini>=
INI configuration files
- These indicate the external data sources for the preimport, together with the database connection parameters.
- They describe the different processing to be run and their sequence.
- Note:
- You must place the files in the folder called <easyvista>\tools\smoBackOffice\QUERIES\SQL\PREIMPORT\. <easyvista> identifies the EasyVista folder on your application server.
- You must place the SQL scripts in a subfolder of the \PREIMPORT\ folder.
; Indicate if the file must be run: ACTIVE parameter (Yes=1/No=0)
[DEFINITION]
NAME=IMPORT EXAMPLE
ACTIVE=1
; Frequency in number of days for running the file
FREQUENCY=1
; Start time for running the file
BEGINTIME=06:00:00
; Indicate if a log file must be created: LOG_DEBUG (Yes=1/No=0)
LOG_DEBUG=1
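; Indicate if the work tables must be kept once processing is complete (Yes=1/No=0)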
DO_NOT_DELETE_TEMPTABLE=0
; Last execution date, in fraction of hours (NLASTRUN) and in YYYYMMDDHHMMSS format (SLASTRUN).
; It will automatically be updated when the file is next run.
;NLASTRUN=
;SLASTRUN=
; ==> List of SQL scripts to be run before storing external data in work tables
[PREPROCESS]
;script_preprocess.sql=
FCT=RESULTAT=WSDL(EVO_BACKOFFICE,50004,TEST_WSDL,ID)
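; ==> List of SQL scripts used for cleanup processing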
[CLEAN]
;scriptclean.sql=
; ==> List of SQL scripts to be run in order to store external data in work tables
[PROCESS]
;IMPORTLDAP1=
; ==> List of SQL scripts to be run on data stored in work tables
[POSTPROCESS]
;script_postprocess.sql=
; ==> Connection parameters to the external databases
; Name the section based on the type of data source: [IMPORTLDAP], [IMPORTCSV], [IMPORTORACLE], [IMPORTADO], ...
;
; ==> LDAP preimport
[IMPORTLDAP1]
DBKERNEL=LDAP
HOSTNAME_1=MyLDAPServerIP
LOGIN_1=MyLogin
PASSWORD_1=MyPassword
DOMAIN_1=DC=staffandline,DC=local
OBJECT_1=(|(&(CN=MYPC)(objectClass=*))(&(CN=MY2PC)(objectClass=*)))
FIELDS_1=DN,ObjectGuid
TABLE_1=MyTable
UNICODE_1=0
DNUNICODE_1=0
SINGLEPASS_1=1
DO_NOT_DELETE_TEMPTABLE=1
protocol_1=3
;ROW_SEPARATOR_1=ONELARGEROWSEPARATOR
;LINE_SEPARATOR_1=ONELARGELINESEPARATOR
; ==> LDAP preimport
[IMPORTLDAP2]
DBKERNEL=LDAP2
HOSTNAME_1=1.2.3.4
PORT_1=389
LOGIN_1=LDAPAccount
PASSWORD_1=********
DOMAIN_1=DC=CUSTOMERNAME,DC=com
OBJECT_1=(&(objectcategory=person)(objectClass=user))
FIELDS_1=cn,displayName,mail,sAMAccountName,c,l,title,telephoneNumber,company,department,mobile,manager,locale,language,countryCode,DN
TABLE_1=E_LDAP_CUSTOMERNAME_TEMP
DO_NOT_DELETE_TEMPTABLE=1
ORGANISATION_FILTER_1=(OU=*)
PROTOCOL_1=3
UNICODE_1=1
DNUNICODE_1=0
SINGLEPASS_1=1
; ==> ADO preimport
[IMPORTADO1]
DBKERNEL=ADO
CONNECTIONSTRING_1=Provider=sqloledb;Data Source=0627-sf\OFFICIEL;Initial Catalog=EVO_ADMIN;User Id=EZV_ADMIN;Password=staff;
TABLE_1=MyTable
REQUEST_1=MyRequest.sql
DO_NOT_DELETE_TEMPTABLE=1
ROW_SEPARATOR_1=ONELARGEROWSEPARATOR
LINE_SEPARATOR_1=ONELARGELINESEPARATOR
; ==> Oracle preimport
[IMPORTORACLE1]
DBKERNEL=ORACLE
HOST=OBELIX
LOGIN=
PASSWORD=
DATABASE=
TABLE_1=
REQUEST_1=MyRequest.sql
TABLE_2=
REQUEST_2=MyRequest.sql
DO_NOT_DELETE_TEMPTABLE=1
; ==> CSV preimport
[IMPORTCSV1]
DBKERNEL=CSV
TABLE_1=TABLE_CSV_TEMP
DIRECTORY_1=C:\easyvista\tools\smoBackOffice\QUERIES\SQL\PREIMPORT\
EXTENTION_1=MyFile.csv
SEPARATOR_1=;
DO_NOT_DELETE_TEMPTABLE=1
[PREPROCESS] section
This section lists the SQL scripts to be run before storing external data in Service Manager work tables. These are usually queries for purging tables.
example
- Purge the myTable table
Configuration file | Details of the SQL script |
---|---|
[PREPROCESS]<br>DELETE_TABLE.sql= | DELETE FROM myTable |
- Run scripts to delete data found in the CUSTOMERNAME folder
Configuration file | Details of the SQL script |
---|---|
[PREPROCESS]<br>CUSTOMERNAME\LDAP_TRUNCATE_TABLE_LDAP_CUSTOMERNAME_TEMP.sql=<br>CUSTOMERNAME\LDAP_TRUNCATE_TABLE_LDAP_CUSTOMERNAME_OK.sql= | |
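The contents of these scripts are not shown above. Each would typically hold a single purge statement; a minimal sketch, based on the work table declared in the [IMPORTLDAP2] section shown earlier:

-- Sketch only: empty the LDAP work table before it is reloaded
TRUNCATE TABLE E_LDAP_CUSTOMERNAME_TEMP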
[PROCESS] section
This section lists the SQL scripts to be run in order to store external data in Service Manager work tables.
- The scripts vary depending on the data source.
- They can be used to populate new Service Manager work tables.
example Run a script to inject data found in the CUSTOMERNAME folder
Configuration file | Details of the SQL script |
---|---|
[PROCESS]<br>CUSTOMERNAME\LDAP_INSERT_TABLE_LDAP_CUSTOMERNAME_OK.sql= | |
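The SQL details are not provided in this entry. A minimal sketch of what such an injection script might contain, assuming the raw rows loaded by the [IMPORTLDAP2] source shown earlier are copied into a second work table named E_LDAP_CUSTOMERNAME_OK (the target table, column list and filter are illustrative):

-- Sketch only: copy raw LDAP rows into a cleaned work table
INSERT INTO E_LDAP_CUSTOMERNAME_OK (cn, displayName, mail, sAMAccountName)
SELECT cn, displayName, mail, sAMAccountName
FROM E_LDAP_CUSTOMERNAME_TEMP
WHERE mail IS NOT NULL  -- illustrative filter: keep only entries with an email address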
[POSTPROCESS] section
This section lists the SQL scripts to be run on data stored in Service Manager work tables. These can be standard scripts, custom scripts defined by customers, normalization scripts or scripts for populating tables.
Note: You should run scripts that populate tables last.
- Standard Service Manager scripts for standardizing data
example Run a script to convert all dates to the format used in Service Manager, and store the standardized data in a new work table
Configuration file | Details of the SQL script |
---|---|
[POSTPROCESS]<br>INSERT_TABLE_LDAP_OK.sql= | INSERT INTO [EVO_BACKOFFICE].ezv_admin.e_csv_temp2 ([qualite],lastname,firstname,[directory],[start_date],[end_date],[email],[phonefixext],[telcom],[cellphone],[sitecode],[desk],[section],[subsidiary],[division],[FolderN1],[department],[login])<br>SELECT Isnull(Rtrim(Upper([civility])), ''),<br>Isnull(Upper([firstname]), ''),<br>Isnull(Lower([lastname]), ''),<br>Isnull([directory], ''),<br>Cast((Substring(start_date, 7, 4)+'-'+Substring(start_date, 4, 2)+'-'+Substring(start_date, 1, 2)+' 00:00:00.000') AS DATETIME),<br>Cast((Substring(end_date, 7, 4)+'-'+Substring(end_date, 4, 2)+'-'+Substring(end_date, 1, 2)+' 00:00:00.000') AS DATETIME) AS date_fin<br>FROM [EVO_BACKOFFICE].ezv_admin.[e_csv_temp]<br>WHERE firstname NOT LIKE 'TEST%' AND directory NOT LIKE 'RET%' AND sitecode NOT LIKE '%--%' |
- Custom scripts for normalizing data, specific to each customer.
example
- Store values in GB instead of MB
- Rebuild the location using three customer fields
- Number the graphic cards for each workstation
- Normalize the titles in a work table
Configuration file | Details of the SQL script |
---|---|
[POSTPROCESS]<br>UPDATE_TABLE_LDAP_OK.sql= | UPDATE [EVO_BACKOFFICE].ezv_admin.e_csv_temp<br>SET civilite = CASE qualite<br>WHEN 'Sir' THEN 'Mr.'<br>WHEN 'sir' THEN 'Mr.'<br>WHEN 'Madam' THEN 'Ms.'<br>WHEN 'Miss' THEN 'Ms.'<br>WHEN 'Mrs' THEN 'Ms.'<br>ELSE qualite<br>END<br>FROM [EVO_BACKOFFICE].ezv_admin.e_csv_temp |
- Data normalization scripts for integration gateways, specific to each customer
example Group all of the SCCM names for the versions of a given software product into the single catalog entry monitored in Service Manager
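No script is shown for this example. A sketch of the idea, assuming a hypothetical SCCM work table e_sccm_software_temp whose software_name column holds the raw SCCM names (the table, column and product names are illustrative):

-- Sketch only: group every SCCM name/version of the software under one catalog label
UPDATE e_sccm_software_temp
SET software_name = 'Microsoft Office'
WHERE software_name LIKE 'Microsoft Office%'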
- Scripts for populating final Service Manager tables.
example Populate a final Service Manager table using a work table
Configuration file | Details of the SQL script |
---|---|
[POSTPROCESS]<br>INSERT_TABLE_LDAP_OK.sql= | INSERT INTO [EVO_BACKOFFICE].[EZV_ADMIN].[e_csv_ok]<br>([civility],[name],[start_date],[end_date],[email],[phone],<br>[cellphone],[rsoc_div_dep],[domain],[language],[date_update],[login])<br>SELECT [civility],[name],[start_date],[end_date],[email],[phonefixext],<br>[cellphone],[rsoc_div_dep],[domain],[language],Getutcdate(),login<br>FROM [EVO_BACKOFFICE].[EZV_ADMIN].[e_csv_temp2] |
You can filter the rows processed by a script by adding a WHERE clause, which may contain a subquery.
example Integrate CSV data without the header row
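No script is shown for this example. A minimal sketch, reusing the work tables from the examples above and assuming the CSV import stored the header line as an ordinary data row (the header test and the column selection are illustrative):

-- Sketch only: integrate CSV rows, skipping the header row and rows already integrated
INSERT INTO [EVO_BACKOFFICE].ezv_admin.e_csv_temp2 ([qualite],[login])
SELECT [civility],[login]
FROM [EVO_BACKOFFICE].ezv_admin.e_csv_temp
WHERE [civility] <> 'civility'  -- the header row repeats the column labels
AND [login] NOT IN (SELECT [login] FROM [EVO_BACKOFFICE].ezv_admin.e_csv_temp2)  -- subquery in the WHERE clause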
Procedure: How to run a preimport using the Service Manager smoBackOffice scheduler
See How to automate an integration to run a preimport using an external scheduler or third-party application.
1. Specify the INI files to be run by the smoBackOffice scheduler.
- Open the PREIMPORT_SQL.sql file found in <easyvista>\tools\smoBackOffice\QUERIES\SQL\PREIMPORT.
- In the [PREIMPORT] section, list the INI configuration files of the SQL scripts to be run, in the order in which they should be run.
2. Define the scheduling parameters in the [DEFINITION] section of the INI configuration files.
- Open each INI configuration file.
- Use the ACTIVE key to indicate that the file must be run.
- Use the FREQUENCY key to define the frequency (in number of days) for running the file.
- Use the BEGINTIME key to specify the start time (in HH:MM:SS format) for running the file.
- Delete the two keys, NLASTRUN and SLASTRUN, that store the last execution date of the file. This will force the execution of the file.
The processing in the SQL scripts will be run when the preimport is next executed.
Specific use case
- Preimport for integrating an LDAP directory containing employee data.
See Implementation.