2 changes: 1 addition & 1 deletion README.md
@@ -1,7 +1,7 @@
# Sample Data

## Overview
This repo contains data that is uploaded into MOSIP during [sandbox installation](https://docs.mosip.io/1.2.0/deployment/sandbox-deployment). The data needs to be reviewed and modified for a country specific deployment. Refer to [Masterdata Guide](https://docs.mosip.io/1.2.0/deployment/masterdata-guide).
This repo contains data that is uploaded into MOSIP during [sandbox installation](https://docs.mosip.io/1.2.0/setup/deploymentnew/getting-started#mosip-installations). The data needs to be reviewed and modified for a country specific deployment. Refer to [Masterdata Guide](https://docs.mosip.io/1.2.0/id-lifecycle-management/support-systems/administration/masterdata-guide).

## For Build and Run
Data initialization is performed through the **Master Data Loader** as part of the [postgres-init](https://github.com/mosip/postgres-init/tree/release-1.3.x) repository.
15 files renamed without changes.
46 changes: 46 additions & 0 deletions mosip_master/data_upgrade/1.1.5.5_to_1.2.0.1/README.md
@@ -0,0 +1,46 @@
## Migrating country-specific data from 1.1.5.5 to 1.2.0.1

Prerequisites:
-> The SQL migration must have been executed successfully.
-> Open the upgrade.properties file and modify the property values as per the environment.

Note: The list of commands executed during the data upgrade can be found in upgrade_commands.txt, one command per line. A command that uses the data-uploader.py script must not be run again after it has executed successfully once; comment it out before the next execution (upgrade.sh ignores commented lines), as in the hypothetical excerpt below.
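A hypothetical excerpt of upgrade_commands.txt (the actual commands and argument values ship with the upgrade package):

```
# Executed successfully on a previous run -- commented out so upgrade.sh skips it:
# python3 data-uploader.py --domain <domain> --username <admin-user> --password <password> --table template --operation Insert --file templates.xlsx
```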


1. Migration of dynamic field table data.
Dynamic field values were stored as a JSON array in version 1.1.5.x; in 1.2.0.1 they are stored as a JSON object, with one row per language of the field, as shown below.

The script takes a backup of the existing table and migrates the dynamic field data into the newly created table.
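For illustration (the field codes and values are hypothetical), one 1.1.5.x row whose value_json holds a JSON array:

```
[{"code":"BT1","value":"O+","langCode":"eng"},
 {"code":"BT1","value":"O+","langCode":"ara"}]
```

becomes one 1.2.0.1 row per langCode, each holding a JSON object in value_json:

```
{"code":"BT1","value":"O+"}
```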

2. UI Spec migration

In 1.1.5.x, both the identity schema and the UI spec were stored in the identity_schema table. From 1.2.0 they are split into two tables, identity_schema and ui_spec. The SQL upgrade script takes care of this data split.

This step migrates the old UI spec to the new UI spec format.
Ref: https://docs.mosip.io/1.2.0/modules/registration-client/registration-client-ui-specifications
-> It is recommended to verify the validators and visibility expressions in the migrated UI spec.
-> After this migration, each old UI spec is divided into three UI specs: "newProcess", "updateProcess" and "lostProcess".
-> AGEGROUP_CONFIG in upgrade.properties should be updated based on the age-group values defined in the property mosip.regproc.packet.classifier.tagging.agegroup.ranges in the registration-default.properties file (see the sketch after this list).
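A sketch of keeping the two properties aligned; the group names and ranges are illustrative, and the exact AGEGROUP_CONFIG syntax should be taken from the packaged upgrade.properties:

```
# registration-default.properties (illustrative ranges)
mosip.regproc.packet.classifier.tagging.agegroup.ranges={'CHILD':'0-17','ADULT':'18-59','SENIOR_CITIZEN':'60-200'}

# upgrade.properties -- mirror the same group names and ranges (syntax assumed)
AGEGROUP_CONFIG={'CHILD':'0-17','ADULT':'18-59','SENIOR_CITIZEN':'60-200'}
```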


Refer to the API documentation below to define and publish a UI spec:

https://mosip.github.io/documentation/1.2.0/kernel-masterdata-service.html#operation/defineUISpec
https://mosip.github.io/documentation/1.2.0/kernel-masterdata-service.html#operation/publishUISpec

3. Template type and template data changes:

New template types and templates are introduced in 1.2.0.1. All the new types and the templates themselves are provided in xlsx files in this directory, in English, Arabic, French, Kannada, Hindi and Tamil.

1. The "id" column in the templates excel sheet is autogenerated before upload to the server.
2. Make sure to remove unsupported languages from the excel files before starting the migration.
3. Cross-check that all the language-specific data is valid and correct with respect to each language; change the template text (file_text column) as required.

Note: "label" and "value" keys have been introduced in the registration client acknowledgment and preview templates. Data in all the captured languages is slash-separated and provided to the template under the "label" and "value" keys. So instead of "primaryLabel"/"primaryValue" and "secondaryLabel"/"secondaryValue", use "label" and "value". For backward compatibility, "primaryLabel" and "primaryValue" are still supported in 1.2.0.1; see the fragment below.
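A minimal, hypothetical template fragment (the field name fullName and the Velocity-style placeholders are assumptions; check the actual templates in the xlsx files):

```
<!-- New style: label/value carry all captured languages, slash separated,
     e.g. "Full Name / Nom complet" -->
<div>$fullName.label : $fullName.value</div>

<!-- Old style, still supported in 1.2.0.1 for backward compatibility -->
<div>$fullName.primaryLabel : $fullName.primaryValue</div>
```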

4. Machine type, machine specification & zone-user mapping:

Resident Service is introduced as a new machine type, and a corresponding machine specification is added.
Most importantly, the Resident Service client is mapped to the topmost zone in the zone hierarchy (the country code).

Note: In zone_user_delta.xlsx, the Resident Service client is mapped to a dummy zone "MOR". Before executing the upgrade.sh script, update the zone_code to the appropriate value and save the changes.
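For reference, a hedged example invocation of data-uploader.py (all argument values are illustrative; run the script with --help for the full argument list):

```
python3 data-uploader.py \
  --domain api-internal.dev.mosip.net \
  --username <admin-user> --password <password> \
  --table template --operation Insert --file templates.xlsx \
  --autogen 1 --idcolumn A --sheetname template \
  --dbusername <db-user> --dbpassword <db-password> \
  --dbhost <db-host> --dbport 5432
```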
162 changes: 162 additions & 0 deletions mosip_master/data_upgrade/1.1.5.5_to_1.2.0.1/data-uploader.py
@@ -0,0 +1,162 @@
#!/usr/bin/python3
# -*- coding: utf-8 -*-


## This script should be executed after DB upgrade and 1.2.0.* masterdata-service deployment

from datetime import datetime, timezone, timedelta
import argparse
import requests
import json
import sys
import time
import psycopg2
import openpyxl

parser = argparse.ArgumentParser(description='CSV/xlsx file uploader script; invokes the 1.2.0.1 bulk upload endpoints')
parser.add_argument("--domain", type=str, required=True, help="Server domain name, eg: api-internal.dev.mosip.net")
parser.add_argument("--username", type=str, required=True, help="User with GLOBAL_ADMIN & REGISTRATION_ADMIN role")
parser.add_argument("--password", type=str, required=True, help="User password")
parser.add_argument("--table", type=str, required=True, help="Database table name")
parser.add_argument("--operation", type=str, required=True, help="Database operation, eg: Insert or Update or Delete")
parser.add_argument("--file", type=str, required=True, help="Input file CSV or xlsx")
parser.add_argument("--autogen", choices=(1,0), default=0, type=int, required=False, help="Autogenerate value for id column")
parser.add_argument("--idcolumn", type=str, required=False, help="id column name, eg: A or B ...")
parser.add_argument("--sheetname", type=str, required=False, help="Sheet name to operate")
parser.add_argument("--dbusername", type=str, required=False, help="DB username")
parser.add_argument("--dbpassword", type=str, required=False, help="DB username")
parser.add_argument("--dbhost", type=str, required=False, help="DB hostname")
parser.add_argument("--dbport", type=str, required=False, help="DB port number")

args = parser.parse_args()

## Values to be updated as per the deployment
authURL='https://'+args.domain+'/v1/authmanager/authenticate/useridPwd'
uploadURL='https://'+args.domain+'/v1/admin/bulkupload'
uploadStatusURL='https://'+args.domain+'/v1/admin/bulkupload/transcation/'  ## 'transcation' (sic) kept as-is to match the service endpoint spelling
username=args.username
password=args.password

def getCurrentDateTime():
    ## Current UTC time in ISO-8601 format with millisecond precision, e.g. 2023-01-01T10:00:00.000Z
    dt_now = datetime.now(timezone.utc)
    dt_now_str = dt_now.strftime('%Y-%m-%dT%H:%M:%S.%f')[:-3]
    return dt_now_str+'Z'


def get_seed_value():
    ## Pick the newest numeric id among the latest 20 rows as the seed for autogenerated ids;
    ## fall back to 1000 when the table is empty or holds no numeric ids.
    seed_value = None
    conn = psycopg2.connect(database="mosip_master", user=args.dbusername, password=args.dbpassword, host=args.dbhost, port=args.dbport)
    cursor = conn.cursor()
    cursor.execute("select id from master."+args.table+" order by id desc limit 20")
    for row in cursor.fetchall():
        id_value = row[0]
        if id_value is None:
            break
        if id_value.isdigit():
            seed_value = id_value
            break

    if seed_value is None:
        seed_value = 1000
    return seed_value


def find_last_data_row(sheet):
    ## Walk upwards from the sheet's max row and return the last row holding any data
    max_row = sheet.max_row

    for row in range(max_row, 0, -1):
        for cell in sheet[row]:
            if cell.value is not None:
                return row

def fill_series():
    ## Autogenerate sequential ids in the given column, starting from the DB seed value
    if args.sheetname is None:
        print("Sheet name is required to fill series in id column.")
        sys.exit(1)

    if args.idcolumn is None:
        print("id column name is required to fill series.")
        sys.exit(1)

    seed_value = get_seed_value()

    print("Sheet name: ", args.sheetname)
    print("Id column to fill series: ", args.idcolumn)
    print("Seed value: ", seed_value)

    workbook = openpyxl.load_workbook(args.file)
    sheet = workbook[args.sheetname]
    column = sheet[args.idcolumn]

    start_row = 2  ## row 1 holds the header
    end_row = find_last_data_row(sheet)

    print("Start Row: ", start_row)
    print("End Row: ", end_row)

    if end_row is None:
        print("Need a valid end_row!")
        return

    ## column[0] is the header cell; fill the data cells with seed_value + offset
    for i, value in enumerate(range(start_row, end_row + 1), start=1):
        column[i].value = int(seed_value) + value

    workbook.save(args.file)
    workbook.close()



def getAccessToken():
    ## Authenticate against authmanager and return the token from the response header
    auth_req_data = {
        'id': 'string',
        'metadata': {},
        'request': {
            'appId': 'admin',
            'password': password,
            'userName': username
        },
        'requesttime': getCurrentDateTime(),
        'version': 'string'
    }
    authresponse = requests.post(authURL, json=auth_req_data)
    print(json.dumps(authresponse.json()))
    return authresponse.headers["authorization"]



def uploadFile():
    ## Optionally autogenerate ids, then post the file to the bulk upload endpoint
    if args.autogen == 1:
        fill_series()

    data = {'category': 'masterdata', 'operation': args.operation, 'tableName': args.table}
    with open(args.file, 'rb') as f:
        files = {'files': f}
        uploadResponse = requests.post(uploadURL, data=data, files=files, headers=req_headers, verify=True)
    uploadResponse_json = uploadResponse.json()
    response = uploadResponse_json['response']
    print(json.dumps(uploadResponse_json))
    return response['transcationId']  ## key spelling matches the service response


def getTransactionStatus(transactionId):
    ## Poll the bulk upload transaction status endpoint
    statusResponse = requests.get(uploadStatusURL+transactionId, headers=req_headers, verify=True)
    statusResponse_json = statusResponse.json()
    response = statusResponse_json['response']
    return response


## Authenticate, upload the file, then poll the transaction status until it completes
req_headers = {'Cookie': 'Authorization='+getAccessToken()}
transactionId = uploadFile()
while True:
    time.sleep(5)  ## poll every 5 seconds
    status_response = getTransactionStatus(transactionId)
    print(json.dumps(status_response))
    status = status_response["status"]
    if status == "COMPLETED":
        break
    if status == "FAILED":
        sys.exit("Transaction failed")




Binary file not shown.
Binary file not shown.
@@ -0,0 +1,90 @@
#!/usr/bin/python3

# Migrates master.dynamic_field data from the 1.1.5.x JSON-array layout to the
# 1.2.0.1 one-row-per-language JSON-object layout, and folds the gender and
# individual_type tables into dynamic fields.
# Usage: <script> <db-user> <db-password> <db-host> <db-port> <gender-field-name> <individual-type-field-name>

import psycopg2
import json
import sys

conn = psycopg2.connect(database="mosip_master", user=sys.argv[1], password=sys.argv[2], host=sys.argv[3], port=sys.argv[4])

print("Opened database successfully")

cur = conn.cursor()

#Backup existing dynamic_field table
cur.execute('ALTER TABLE master.dynamic_field RENAME TO dynamic_field_migr_bkp;')

print("Renamed dynamic_field table to dynamic_field_migr_bkp")

#Create dynamic_field table
cur.execute('''CREATE TABLE master.dynamic_field(
id character varying(36) NOT NULL,
name character varying(36) NOT NULL,
description character varying(256),
data_type character varying(16),
value_json character varying,
lang_code character varying(3) NOT NULL,
is_active boolean NOT NULL,
cr_by character varying(256) NOT NULL,
cr_dtimes timestamp NOT NULL,
upd_by character varying(256),
upd_dtimes timestamp,
is_deleted boolean DEFAULT FALSE,
del_dtimes timestamp,
CONSTRAINT pk_dynamic_id PRIMARY KEY (id));''')

print("created table dynamic_field")


cur.execute('GRANT SELECT,INSERT,UPDATE,DELETE,TRUNCATE,REFERENCES ON master.dynamic_field TO masteruser;')
print("Applied grant on dynamic_field")

#Query all the records from backup table
cur.execute('select * from master.dynamic_field_migr_bkp;')
rows = cur.fetchall()

print("Data fetched from backup table")

list_entities = []

#Iterate through each backup row; explode the JSON array into one entry per language
for row in rows:
    values = json.loads(row[4])
    for val in values:
        vmap = {'code': val['code'], 'value': val['value']}
        list_entities.append(json.dumps({"name": row[1], "langCode": val['langCode'], "value_json": json.dumps(vmap), "is_active": row[6]}))


#Fold gender table rows into the dynamic field named by sys.argv[5]
cur.execute('select * from master.gender;')
gender_rows = cur.fetchall()
for row in gender_rows:
    vmap = {'code': row[0], 'value': row[1]}
    list_entities.append(json.dumps({"name": sys.argv[5], "langCode": row[2], "value_json": json.dumps(vmap), "is_active": row[3]}))


#Fold individual_type table rows into the dynamic field named by sys.argv[6]
cur.execute('select * from master.individual_type;')
individual_type_rows = cur.fetchall()
for row in individual_type_rows:
    vmap = {'code': row[0], 'value': row[1]}
    list_entities.append(json.dumps({"name": sys.argv[6], "langCode": row[2], "value_json": json.dumps(vmap), "is_active": row[3]}))


id = 1000
stmt = 'insert into master.dynamic_field values (%s,%s,%s,%s,%s,%s,%s,%s,now(),NULL,NULL,False,NULL);'
unique_entities = set(list_entities)
for entity_str in unique_entities:
    id = id + 1
    entity = json.loads(entity_str)
    status = bool(entity['is_active'])
    #Execute the insert statement; the field name doubles as the description
    cur.execute(stmt, (str(id), entity['name'], entity['name'], 'string', entity['value_json'], entity['langCode'], status, 'migration-script'))


# Commit and close connection
conn.commit()

print("Closing the database connection")
conn.close()
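A hedged example invocation (the script filename and the dynamic-field names are illustrative; pass the field names used in your identity schema):

```
python3 dynamic_field_migration.py <db-user> <db-password> <db-host> 5432 gender residenceStatus
```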