Merge pull request #1008 from dondi/beta
v6.0.4
Showing 32 changed files with 1,711 additions and 788 deletions.
@@ -44,6 +44,8 @@ rules:
    - error
  brace-style:
    - error
    - 1tbs
    - allowSingleLine: true
  comma-spacing:
    - error
  max-len:
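For context, the two added options configure ESLint's `brace-style` rule to the "one true brace style" while still permitting single-line blocks. Roughly, the rule distinguishes:

```
// 1tbs: `else` stays on the same line as the closing brace.
if (ready) {
    run();
} else {
    wait();
}

// allowSingleLine: true additionally permits compact one-liners.
if (ready) { run(); }
```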
@@ -1 +1,87 @@
Here are the files pertaining to both the network and expression databases. Look within the README.md files of both folders for information pertinent to the schema that you intend to use.
# GRNsight Database
Here are the files pertaining to both the network and expression databases. Look within the README.md files of both folders for information pertinent to the schema that you intend to use.
## Setting up a local postgres GRNsight Database
1. Installing PostgreSQL on your computer
   - macOS and Windows users can follow these [instructions](https://dondi.lmu.build/share/db/postgresql-setup-day.pdf) on how to install PostgreSQL.
     - Step 1 tells you how to install PostgreSQL on your local machine, initialize a database, and start and stop your database instance.
     - If your terminal emits a message that looks like `initdb --locale=C -E UTF-8 location-of-cluster` during Step 1B, then your installer has already initialized a database for you.
     - Additionally, your installer may start the server for you upon installation. To start the server yourself, run `pg_ctl start -D location-of-cluster`; to stop it, run `pg_ctl stop -D location-of-cluster`. A consolidated sketch of these commands appears after this list.
   - Linux users
     - The macOS and Windows instructions will _probably_ not work for you. You can try them at your own risk.
     - Linux users can try these [instructions](https://www.geeksforgeeks.org/install-postgresql-on-linux/), which should work (...maybe...). If they don't, try searching for instructions specific to your distribution. Sorry!
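   As a quick reference, the server lifecycle commands above look like this, assuming a hypothetical cluster location of `~/pgdata`:
   ```
   pg_ctl status -D ~/pgdata   # check whether the server is running
   pg_ctl start -D ~/pgdata    # start the server
   pg_ctl stop -D ~/pgdata     # stop the server
   ```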
2. Loading data to your database
   1. Adding the schemas to your database.
      1. Go into your database using the following command:
         ```
         psql postgresql://localhost/postgres
         ```
         From there, create the schemas using the following commands:
         ```
         CREATE SCHEMA spring2022_network;
         ```
         ```
         CREATE SCHEMA fall2021;
         ```
         Once they are created, you can exit your database using the command `\q`.
      2. Once your schemas are created, you can add the table specifications using the following commands:
         ```
         psql postgresql://localhost/postgres -f <path to GRNsight/database/network-database>/schema.sql
         ```
         ```
         psql postgresql://localhost/postgres -f <path to GRNsight/database/expression-database>/schema.sql
         ```
         Your database is now ready to accept expression and network data!
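         To double-check, you can list the schemas non-interactively (`\dn` is psql's command for listing schemas; both schemas created above should appear):
         ```
         psql postgresql://localhost/postgres -c '\dn'
         ```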
   2. Loading the GRNsight network data into your local database
      1. GRNsight generates network data from SGD through YeastMine. In order to run the script that generates these network files, you must `pip3 install` the dependencies it uses. If you get an error saying that a module doesn't exist, run `pip3 install <Module Name>` and that should fix it. If the error persists and points to a specific file on your machine, you may have to go into that file manually and alter the naming of the dependencies it imports. _Note: So far this issue has only occurred on Ubuntu 22.04.1, so you might be lucky and not have to do it!_
         ```
         pip3 install pandas requests intermine tzlocal
         ```
         Once the dependencies have been installed, you can run
         ```
         python3 <path to GRNsight/database/network-database/scripts>/generate_network.py
         ```
         This will take a while, as it fetches all of the network data and generates all of the files. It creates a folder of processed files in `database/network-database/script-results`.
      2. Load the processed files into your database.
         ```
         python3 <path to GRNsight/database/network-database/scripts>/loader.py | psql postgresql://localhost/postgres
         ```
         This should output a series of COPY statements to your terminal. Once complete, your database is loaded with the network data.
   3. Loading the GRNsight expression data into your local database
      1. Create a directory (aka folder) in the `database/expression-database` folder called `source-files`.
         ```
         mkdir <path to GRNsight/database/expression-database>/source-files
         ```
      2. Download the _"Expression 2020"_ folder from Box, located at `GRNsight > GRNsight Expression > Expression 2020`, into your newly created `source-files` folder.
      3. Run the preprocessing script on the data. This will create a folder of processed files in `database/expression-database/script-results`.
         ```
         python3 <path to GRNsight/database/expression-database/scripts>/preprocessing.py
         ```
      4. Load the processed files into your database.
         ```
         python3 <path to GRNsight/database/expression-database/scripts>/loader.py | psql postgresql://localhost/postgres
         ```
         This should output a series of COPY statements to your terminal. Once complete, your database is loaded with the expression data.
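         For reference, what `loader.py` pipes into `psql` is PostgreSQL's `COPY ... FROM stdin` format: a `COPY` statement, tab-separated rows, then a terminating `\.` line. A trimmed illustration from the expression loader:
         ```
         COPY fall2021.ref (pubmed_id, authors, publication_year, title, doi, ncbi_geo_id) FROM stdin;
         12269742	Kitagawa E., Takahashi J., Momose Y., Iwahashi H.	2002	Effects of the Pesticide Thiuram: ...	10.1021/es015705v	GSE9336
         \.
         ```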
@@ -0,0 +1,60 @@
# Expression Database

All files pertaining to the expression database live within this directory.

## The basics

#### Schema

All expression data is stored within the fall2021 schema on our Postgres database.

The schema is located within this directory at the top level in the file `schema.sql`. It defines the tables located within the fall2021 schema.

Usage:
To load to a local database:
```
psql postgresql://localhost/postgres -f schema.sql
```
To load to the production database:
```
psql <address to database> -f schema.sql
```
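To confirm the tables were created, you can list everything in the schema (`\dt` is psql's list-tables command):
```
psql postgresql://localhost/postgres -c '\dt fall2021.*'
```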
### Scripts

All scripts live within the subdirectory `scripts`, located at the top level of the expression database directory.

Any source files required to run the scripts live within the subdirectory `source-files`, located at the top level of the expression database directory. As source files may be large, you must create this directory yourself and add any source files you need to use there.

All generated results of the scripts live in the subdirectory `script-results`, located at the top level of the expression database directory. Currently, all scripts that generate output create this directory if it does not already exist. When adding a new script that generates output, best practice is to create the `script-results` directory and any subdirectories if they do not exist, in order to prevent errors and snafus in freshly cloned repositories.
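A minimal sketch of that practice, using `os.makedirs` with `exist_ok=True` (equivalent in effect to the existence checks the current scripts perform):
```
import os

# Create the output directory (and any missing parents) only if needed,
# so the script also works in a freshly cloned repository.
os.makedirs('../script-results/processed-expression', exist_ok=True)
```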
Within the scripts directory, there are the following files:

- `preprocessing.py`
- `loader.py`

#### Data Preprocessor(s)
*Note: Data preprocessing is always specific to each dataset that you obtain. `preprocessing.py` is capable of preprocessing the specific expression data files located in `source-files/Expression 2020`. Because these files are too large to be stored on GitHub, access the direct source files on Box and move them into this directory. If more data sources are to be added to the database, create a new directory in `source-files` for each one, note it in this `README.md` file, and create a new preprocessing script for that data source (if required). Please document the changes in this section so that future developers may use your work to recreate the database if ever required.*

* The script (`preprocessing.py`) is used to preprocess the data in `source-files/Expression 2020`. It parses through each file to construct the processed loader files, so that they are ready to load using `loader.py`. Please read through the code, as there are instructions on what to add within the comments. Good luck!
* The resulting processed data files are located in `script-results/processed-expression`, and the resulting processed loader files are located within `script-results/processed-loader-files`.

Usage:
```
python3 preprocessing.py
```
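Based on the script's `print` statements, a successful run logs each source file it reads and each file it creates, roughly:
```
Processing file ../source-files/Expression 2020/ExpressionData.csv
Creating ../script-results/processed-expression/expression-data.csv
...
Creating ../script-results/processed-expression/genes.csv
```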
#### Database Loader
This script (`loader.py`) is used to load your preprocessed expression data into the database.
This program generates direct SQL statements from the source files generated by the data preprocessor, in order to populate a relational database with those files' data.
Usage:
To load to a local database:
```
python3 loader.py | psql postgresql://localhost/postgres
```
To load to the production database:
```
python3 loader.py | psql <path to database>
```
@@ -0,0 +1,71 @@
CREATE TABLE fall2021.ref (
    pubmed_id VARCHAR,
    authors VARCHAR,
    publication_year VARCHAR,
    title VARCHAR,
    doi VARCHAR,
    ncbi_geo_id VARCHAR,
    PRIMARY KEY(ncbi_geo_id, pubmed_id)
);

CREATE TABLE fall2021.gene (
    gene_id VARCHAR, -- systematic-like name
    display_gene_id VARCHAR, -- standard-like name
    species VARCHAR,
    taxon_id VARCHAR,
    PRIMARY KEY(gene_id, taxon_id)
);

CREATE TABLE fall2021.expression_metadata (
    ncbi_geo_id VARCHAR,
    pubmed_id VARCHAR,
    FOREIGN KEY (ncbi_geo_id, pubmed_id) REFERENCES fall2021.ref(ncbi_geo_id, pubmed_id),
    control_yeast_strain VARCHAR,
    treatment_yeast_strain VARCHAR,
    control VARCHAR,
    treatment VARCHAR,
    concentration_value FLOAT,
    concentration_unit VARCHAR,
    time_value FLOAT,
    time_unit VARCHAR,
    number_of_replicates INT,
    expression_table VARCHAR,
    display_expression_table VARCHAR,
    PRIMARY KEY(ncbi_geo_id, pubmed_id, time_value)
);

CREATE TABLE fall2021.expression (
    gene_id VARCHAR,
    taxon_id VARCHAR,
    FOREIGN KEY (gene_id, taxon_id) REFERENCES fall2021.gene(gene_id, taxon_id),
    -- ncbi_geo_id VARCHAR,
    -- pubmed_id VARCHAR,
    sort_index INT,
    sample_id VARCHAR,
    expression FLOAT,
    time_point FLOAT,
    dataset VARCHAR,
    PRIMARY KEY(gene_id, sample_id)
    -- FOREIGN KEY (ncbi_geo_id, pubmed_id, time_point) REFERENCES fall2021.expression_metadata(ncbi_geo_id, pubmed_id, time_value)
);

CREATE TABLE fall2021.degradation_rate (
    gene_id VARCHAR,
    taxon_id VARCHAR,
    FOREIGN KEY (gene_id, taxon_id) REFERENCES fall2021.gene(gene_id, taxon_id),
    ncbi_geo_id VARCHAR,
    pubmed_id VARCHAR,
    FOREIGN KEY (ncbi_geo_id, pubmed_id) REFERENCES fall2021.ref(ncbi_geo_id, pubmed_id),
    PRIMARY KEY(gene_id, ncbi_geo_id, pubmed_id),
    degradation_rate FLOAT
);

CREATE TABLE fall2021.production_rate (
    gene_id VARCHAR,
    taxon_id VARCHAR,
    FOREIGN KEY (gene_id, taxon_id) REFERENCES fall2021.gene(gene_id, taxon_id),
    ncbi_geo_id VARCHAR,
    pubmed_id VARCHAR,
    FOREIGN KEY (ncbi_geo_id, pubmed_id) REFERENCES fall2021.ref(ncbi_geo_id, pubmed_id),
    PRIMARY KEY(gene_id, ncbi_geo_id, pubmed_id),
    production_rate FLOAT
    -- FOREIGN KEY (gene_id, ncbi_geo_id, pubmed_id) REFERENCES fall2021.degradation_rate(gene_id, ncbi_geo_id, pubmed_id) -- not sure if we want to link the generated production rate to its original degradation rate
);
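A sketch of the kind of query this schema supports once data is loaded (the gene and dataset values are hypothetical placeholders):

-- Pull one gene's expression time course from one dataset.
SELECT g.display_gene_id, e.sample_id, e.time_point, e.expression
FROM fall2021.expression e
JOIN fall2021.gene g
    ON g.gene_id = e.gene_id AND g.taxon_id = e.taxon_id
WHERE g.display_gene_id = 'CIN5'   -- hypothetical gene
    AND e.dataset = 'example'      -- hypothetical dataset label
ORDER BY e.time_point, e.sample_id;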
@@ -0,0 +1,186 @@
import csv

# Usage
# python3 loader.py | psql postgresql://localhost/postgres

"""
This program generates direct SQL statements from the source files in order
to populate a relational database with those files' data.
By taking the approach of emitting SQL statements directly, we bypass the need to import
some kind of database library for the loading process, instead passing the statements
directly into a database command line utility such as `psql`.
"""

"""
Stolen from https://www.kite.com/python/answers/how-to-check-if-a-string-is-a-valid-float-in-python
"""
def check_float(potential_float):
    try:
        float(potential_float)
        return True
    except ValueError:
        return False

"""
Inspired by https://www.kite.com/python/answers/how-to-check-if-a-string-is-a-valid-float-in-python
"""
def check_int(potential_int):
    try:
        int(potential_int)
        return True
    except ValueError:
        return False

"""
Created out of necessity
"""
def convert_float(potential_float):
    # Strip all whitespace; fall back to a sentinel value when empty.
    stripped = "".join(potential_float.split()).replace(" ", "")
    return float(stripped) if stripped else -0.000000000001

"""
Created out of necessity
"""
def convert_int(potential_int):
    # Strip all whitespace; fall back to a sentinel value when not an int.
    stripped = "".join(potential_int.split()).replace(" ", "")
    return int(stripped) if check_int(stripped) else -1111111

"""
Loads refs into the database.
"""
def LOAD_REFS():
    print('COPY fall2021.ref (pubmed_id, authors, publication_year, title, doi, ncbi_geo_id) FROM stdin;')
    REFS_SOURCE = '../script-results/processed-expression/refs.csv'
    with open(REFS_SOURCE, 'r+') as f:
        reader = csv.reader(f)
        row_num = 0
        for row in reader:
            if row_num != 0:
                # The processed files are tab-separated; rejoin the comma-split
                # row and split on tabs to recover the original fields.
                r = ','.join(row).split('\t')
                pubmed_id = r[0]
                authors = r[1]
                publication_year = r[2]
                title = r[3]
                doi = r[4]
                ncbi_geo_id = r[5]
                print(f'{pubmed_id}\t{authors}\t{publication_year}\t{title}\t{doi}\t{ncbi_geo_id}')
            row_num += 1
    print('\\.')

"""
Loads gene ID mappings into the database.
"""
def LOAD_GENES():
    print('COPY fall2021.gene (gene_id, display_gene_id, species, taxon_id) FROM stdin;')
    GENE_SOURCE = '../script-results/processed-expression/genes.csv'
    with open(GENE_SOURCE, 'r+') as f:
        reader = csv.reader(f)
        row_num = 0
        for row in reader:
            if row_num != 0:
                r = ','.join(row).split('\t')
                gene_id = r[0]
                display_gene_id = r[1]
                species = r[2]
                taxon_id = r[3]
                print(f'{gene_id}\t{display_gene_id}\t{species}\t{taxon_id}')
            row_num += 1
    print('\\.')

"""
Loads expression metadata into the database.
"""
def LOAD_EXPRESSION_METADATA():
    print('COPY fall2021.expression_metadata (ncbi_geo_id, pubmed_id, control_yeast_strain, treatment_yeast_strain, control, treatment, concentration_value, concentration_unit, time_value, time_unit, number_of_replicates, expression_table) FROM stdin;')
    EXPRESSION_METADATA_SOURCE = '../script-results/processed-expression/expression-metadata.csv'
    with open(EXPRESSION_METADATA_SOURCE, 'r+') as f:
        reader = csv.reader(f)
        row_num = 0
        for row in reader:
            if row_num != 0:
                r = ','.join(row).split('\t')
                ncbi_geo_id = r[0]
                pubmed_id = r[1]
                control_yeast_strain = r[2]
                treatment_yeast_strain = r[3]
                control = r[4]
                treatment = r[5]
                concentration_value = float(r[6])
                concentration_unit = r[7]
                time_value = float(r[8])
                time_unit = r[9]
                number_of_replicates = int(r[10])
                expression_table = r[11]

                print(f'{ncbi_geo_id}\t{pubmed_id}\t{control_yeast_strain}\t{treatment_yeast_strain}\t{control}\t{treatment}\t{concentration_value}\t{concentration_unit}\t{time_value}\t{time_unit}\t{number_of_replicates}\t{expression_table}')
            row_num += 1
    print('\\.')

"""
Loads expression data into the database.
"""
def LOAD_EXPRESSION_DATA():
    print('COPY fall2021.expression (gene_id, taxon_id, sort_index, sample_id, expression, time_point, dataset) FROM stdin;')
    EXPRESSION_DATA_SOURCE = '../script-results/processed-expression/expression-data.csv'
    with open(EXPRESSION_DATA_SOURCE, 'r+') as f:
        reader = csv.reader(f)
        row_num = 0
        for row in reader:
            if row_num != 0:
                r = ','.join(row).split('\t')
                gene_id = r[0]
                taxon_id = r[1]
                sort_index = int(r[2])
                sample_id = r[3]
                expression = float(r[4]) if r[4] != "" else "NaN"
                time_point = float(r[5])
                data_set = r[6]
                print(f'{gene_id}\t{taxon_id}\t{sort_index}\t{sample_id}\t{expression}\t{time_point}\t{data_set}')
            row_num += 1
    print('\\.')

"""
Loads production rates into the database.
"""
def LOAD_PRODUCTION_RATES():
    print('COPY fall2021.production_rate (gene_id, taxon_id, ncbi_geo_id, pubmed_id, production_rate) FROM stdin;')
    PRODUCTION_RATES_SOURCE = '../script-results/processed-expression/production-rates.csv'
    with open(PRODUCTION_RATES_SOURCE, 'r+') as f:
        reader = csv.reader(f)
        row_num = 0
        for row in reader:
            if row_num != 0:
                r = ','.join(row).split('\t')
                gene_id = r[0]
                taxon_id = r[1]
                ncbi_geo_id = r[2]
                pubmed_id = r[3]
                production_rate = float(r[4]) if r[4] != "" else "NaN"
                print(f'{gene_id}\t{taxon_id}\t{ncbi_geo_id}\t{pubmed_id}\t{production_rate}')
            row_num += 1
    print('\\.')

"""
Loads degradation rates into the database.
"""
def LOAD_DEGRADATION_RATES():
    print('COPY fall2021.degradation_rate (gene_id, taxon_id, ncbi_geo_id, pubmed_id, degradation_rate) FROM stdin;')
    DEGRADATION_RATES_SOURCE = '../script-results/processed-expression/degradation-rates.csv'
    with open(DEGRADATION_RATES_SOURCE, 'r+') as f:
        reader = csv.reader(f)
        row_num = 0
        for row in reader:
            if row_num != 0:
                r = ','.join(row).split('\t')
                gene_id = r[0]
                taxon_id = r[1]
                ncbi_geo_id = r[2]
                pubmed_id = r[3]
                degradation_rate = float(r[4]) if r[4] != "" else "NaN"
                print(f'{gene_id}\t{taxon_id}\t{ncbi_geo_id}\t{pubmed_id}\t{degradation_rate}')
            row_num += 1
    print('\\.')

LOAD_REFS()
LOAD_GENES()
LOAD_EXPRESSION_METADATA()
LOAD_EXPRESSION_DATA()
LOAD_PRODUCTION_RATES()
LOAD_DEGRADATION_RATES()
@@ -0,0 +1,197 @@
import csv
import os

# Need to manually add Dahlquist data to expression metadata and refs

species = "Saccharomyces cerevisiae"
taxon_id = "559292"

# Gene ID generation and expression data generation

# Create folder paths
if not os.path.exists('../script-results'):
    os.makedirs('../script-results')

if not os.path.exists('../script-results/processed-expression/'):
    os.makedirs('../script-results/processed-expression')

# For simplicity, we assume that the program runs from within the expression-database scripts folder.
EXPRESSION_DATA_SOURCE = '../source-files/Expression 2020/ExpressionData.csv'
EXPRESSION_DATA_DESTINATION = '../script-results/processed-expression/expression-data.csv'
EXPRESSION_SHEET_DESTINATION = '../script-results/processed-expression/expression-sheet.csv'
GENES_DESTINATION = '../script-results/processed-expression/genes.csv'

genes = {}
expression_data = []
expression_sheets = {}
print(f'Processing file {EXPRESSION_DATA_SOURCE}')
with open(EXPRESSION_DATA_SOURCE, 'r+', encoding="UTF-8") as f:
    i = 0
    replicate_count = 0
    prev_dataset = ""
    reader = csv.reader(f)
    for row in reader:
        if i != 0:
            col_num = 0
            display_gene_id = row[2].replace('\t', '')
            gene_id = row[1].replace('\t', '')
            sort_index = row[0]
            sample_id = row[4]
            expression = row[5]
            time_points = row[6]
            dataset = row[7]
            # update the objects
            if gene_id not in genes:
                genes.update({gene_id: [display_gene_id, species, taxon_id]})
            expression_data.append([gene_id, taxon_id, sort_index, sample_id, expression, time_points, dataset])
        i += 1

print(f'Creating {EXPRESSION_DATA_DESTINATION}\n')
expression_data_file = open(EXPRESSION_DATA_DESTINATION, 'w')
expression_data_file.write('Gene ID\tTaxon ID\tSort Index\tSample ID\tExpression\tTime Points\tDataset\n')
for d in expression_data:
    result = '{}\t{}\t{}\t{}\t{}\t{}\t{}'.format(d[0], d[1], d[2], d[3], d[4], d[5], d[6])
    expression_data_file.write(f'{result}\n')
expression_data_file.close()

# Expression metadata
EXPRESSION_METADATA_SOURCE = '../source-files/Expression 2020/ExpressionMetadata.csv'
EXPRESSION_METADATA_DESTINATION = '../script-results/processed-expression/expression-metadata.csv'
# Add Dahlquist data here
expression_metadata = [
    # [1, 'GSE83656', '', 'control_yeast_strain', 'treatment_yeast_strain', 'control', 'treatment', 'concentration_value', 'concentration_unit', 'time_value', 'time_unit', 'number_of_replicates,', 'expression_table'],
    # [3, 'GSE83656', '', 'control_yeast_strain', 'treatment_yeast_strain', 'control', 'treatment', 'concentration_value', 'concentration_unit', 'time_value', 'time_unit', 'number_of_replicates,', 'expression_table'],
    # [2, 'GSE83656', '', 'control_yeast_strain', 'treatment_yeast_strain', 'control', 'treatment', 'concentration_value', 'concentration_unit', 'time_value', 'time_unit', 'number_of_replicates,', 'expression_table'],
    # [4, 'GSE83656', '', 'control_yeast_strain', 'treatment_yeast_strain', 'control', 'treatment', 'concentration_value', 'concentration_unit', 'time_value', 'time_unit', 'number_of_replicates,', 'expression_table'],
]

pubmed_to_geo_conversion = {
    '12269742': 'GSE9336',
    '17327492': 'GSE6129',
    '23039231': 'GSE24712'
}

print(f'Processing file {EXPRESSION_METADATA_SOURCE}')
with open(EXPRESSION_METADATA_SOURCE, 'r+', encoding="UTF-8") as f:
    i = 0
    reader = csv.reader(f)
    for row in reader:
        if i != 0:
            # replicate_index = row[0][-1]
            pubmed_id = row[1]
            geo_id = pubmed_to_geo_conversion[pubmed_id]
            control_yeast_strain = row[2]
            treatment_yeast_strain = row[3]
            control = row[4]
            treatment = row[5]
            concentration_value = row[6]
            concentration_unit = row[7]
            time_value = row[8]
            time_unit = row[9]
            number_of_replicates = row[10]
            expression_table = row[11]

            expression_metadata.append([geo_id, pubmed_id, control_yeast_strain, treatment_yeast_strain, control, treatment, concentration_value, concentration_unit, time_value, time_unit, number_of_replicates, expression_table])
        # next row
        i += 1

print(f'Creating {EXPRESSION_METADATA_DESTINATION}\n')
expression_metadata_file = open(EXPRESSION_METADATA_DESTINATION, 'w')
expression_metadata_file.write('NCBI GEO ID\tPubmed ID\tControl Yeast Strain\tTreatment Yeast Strain\tControl\tTreatment\tConcentration Value\tConcentration Unit\tTime Value\tTime Units\tNumber of Replicates\tExpression Table\n')
for m in expression_metadata:
    expression_metadata_file.write(f'{m[0]}\t{m[1]}\t{m[2]}\t{m[3]}\t{m[4]}\t{m[5]}\t{m[6]}\t{m[7]}\t{m[8]}\t{m[9]}\t{m[10]}\t{m[11]}\n')
expression_metadata_file.close()

# Refs csv file generation (she is smol, so we write her ourselves)
refs = [
    # [pubmed_id, authors, publication_year, title, doi, ncbi_geo_id]
    ['12269742', 'Kitagawa E., Takahashi J., Momose Y., Iwahashi H.', '2002', 'Effects of the Pesticide Thiuram: Genome-wide Screening of Indicator Genes by Yeast DNA Microarray', '10.1021/es015705v', 'GSE9336'],
    ['17327492', 'Thorsen, M., Lagniel, G., Kristiansson, E., Junot, C., Nerman, O., Labarre, J., & Tamás, M. J.', '2007', 'Quantitative transcriptome, proteome, and sulfur metabolite profiling of the Saccharomyces cerevisiae response to arsenite.', '10.1152/physiolgenomics.00236.2006', 'GSE6129'],
    ['23039231', 'Barreto, L., Canadell, D., Valverde‐Saubí, D., Casamayor, A., & Ariño, J.', '2012', 'The short‐term response of yeast to potassium starvation', '10.1111/j.1462-2920.2012.02887.x', 'GSE24712'],
    ['', 'Dahlquist KD, Abdulla H, Arnell AJ, Arsan C, Baker JM, Carson RM, Citti WT, De Las Casas SE, Ellis LG, Entzminger KC, Entzminger SD, Fitzpatrick BG, Flores SP, Harmon NS, Hennessy KP, Herman AF, Hong MV, King HL, Kubeck LN, La-Anyane OM, Land DL, Leon Guerrero MJ, Liu EM, Luu MD, McGee KP, Mejia MR, Melone SN, Pepe NT, Rodriguez KR, Rohacz NA, Rovetti RJ, Sakhon OS, Sampana JT, Sherbina K, Terada LH, Vega AJ, Wavrin AJ, Wyllie KW, Zapata BB',
     '2018', 'Global transcriptional response of wild type and transcription factor deletion strains of Saccharomyces cerevisiae to the environmental stress of cold shock and subsequent recovery',
     '', 'GSE83656'],
    ['25161313', 'Neymotin, B., Athanasiadou R., and Gresham D.', '2014', 'Determination of in vivo RNA kinetics using RATE-seq. RNA, 20, 1645-1652.', '10.1261/rna.045104.114', '']
]

REFS_DESTINATION = '../script-results/processed-expression/refs.csv'
print(f'Creating {REFS_DESTINATION}\n')
refs_file = open(REFS_DESTINATION, 'w')
refs_file.write('Pubmed ID\tAuthors\tPublication Year\tTitle\tDOI\tNCBI GEO ID\n')
for r in refs:
    result = '{}\t{}\t{}\t{}\t{}\t{}'.format(r[0], r[1], r[2], r[3], r[4], r[5])
    refs_file.write(f'{result}\n')
refs_file.close()

# Degradation rates
DEGRADATION_RATES_SOURCE = '../source-files/Expression 2020/DegradationRates.csv'
DEGRADATION_RATES_DESTINATION = '../script-results/processed-expression/degradation-rates.csv'

degradation_rates = []

print(f'Processing file {DEGRADATION_RATES_SOURCE}')
with open(DEGRADATION_RATES_SOURCE, 'r+', encoding="UTF-8") as f:
    i = 0
    reader = csv.reader(f)
    for row in reader:
        if i != 0:
            gene_id = row[0]
            display_gene_id = row[1]
            degradation_rate = row[2]
            pubmed_id = "25161313"
            geo_id = ""
            degradation_rates.append([gene_id, taxon_id, geo_id, pubmed_id, degradation_rate])
            if gene_id not in genes:
                genes.update({gene_id: [display_gene_id, species, taxon_id]})
        i += 1

print(f'Creating {DEGRADATION_RATES_DESTINATION}\n')
degradation_rates_file = open(DEGRADATION_RATES_DESTINATION, 'w')
degradation_rates_file.write('Gene ID\tTaxon ID\tNCBI GEO ID\tPubmed ID\tDegradation Rate\n')
for r in degradation_rates:
    result = '{}\t{}\t{}\t{}\t{}'.format(r[0], r[1], r[2], r[3], r[4])
    degradation_rates_file.write(f'{result}\n')
degradation_rates_file.close()

# Production rates
PRODUCTION_RATES_SOURCE = '../source-files/Expression 2020/ProductionRates.csv'
PRODUCTION_RATES_DESTINATION = '../script-results/processed-expression/production-rates.csv'

production_rates = []

print(f'Processing file {PRODUCTION_RATES_SOURCE}')
with open(PRODUCTION_RATES_SOURCE, 'r+', encoding="UTF-8") as f:
    i = 0
    reader = csv.reader(f)
    for row in reader:
        if i != 0:
            gene_id = row[0]
            display_gene_id = row[1]
            production_rate = row[2]
            pubmed_id = "25161313"
            geo_id = ""
            production_rates.append([gene_id, taxon_id, geo_id, pubmed_id, production_rate])
            if gene_id not in genes:
                genes.update({gene_id: [display_gene_id, species, taxon_id]})
        # next row
        i += 1

print(f'Creating {PRODUCTION_RATES_DESTINATION}\n')
production_rates_file = open(PRODUCTION_RATES_DESTINATION, 'w')
production_rates_file.write('Gene ID\tTaxon ID\tNCBI GEO ID\tPubmed ID\tProduction Rate\n')
for r in production_rates:
    result = '{}\t{}\t{}\t{}\t{}'.format(r[0], r[1], r[2], r[3], r[4])
    production_rates_file.write(f'{result}\n')
production_rates_file.close()

print(f'Creating {GENES_DESTINATION}\n')
genes_file = open(GENES_DESTINATION, 'w')
genes_file.write('Gene ID\tDisplay Gene ID\tSpecies\tTaxon ID\n')
for g in genes:
    result = '{}\t{}\t{}\t{}'.format(g, genes[g][0], genes[g][1], genes[g][2])
    genes_file.write(f'{result}\n')
genes_file.close()
@@ -1,124 +1,88 @@
import {responseCustomWorkbookData} from "../setup-load-and-import-handlers";

// Expression DB Access Functions
const buildExpressionTimepointsString = function (selection) {
    let timepoints = "";
    selection.timepoints.forEach(x => timepoints += (x + ","));
    return timepoints.substring(0, timepoints.length - 1);
};
const buildExpressionGeneQuery = function (workbookGenes) {
    let genes = "";
    workbookGenes.forEach(x => genes += (x.name + ","));
    return genes.substring(0, genes.length - 1);
import { responseCustomWorkbookData } from "../setup-load-and-import-handlers";
// General DB Access Functions
const buildQueryURL = function(path, parameters) {
    const searchParams = new URLSearchParams("");
    for (let p in parameters) {
        searchParams.append(p, parameters[p]);
    }
    return `${path}?${searchParams.toString()}`;
};

const buildExpressionURL = function (selection, genes) {
    const baseQuery = `expressiondb?dataset=${selection.dataset}&genes=${buildExpressionGeneQuery(genes)}`;
    return selection.timepoints ?
        `${baseQuery}&timepoints=${buildExpressionTimepointsString(selection)}` :
        baseQuery;
const responseData = (database, formData, queryURL) => {
    return new Promise(function(resolve) {
        const uploadRoute = queryURL;
        const fullUrl = [$(".service-root").val(), uploadRoute].join("/");
        (formData
            ? $.ajax({
                url: fullUrl,
                data: formData,
                processData: false,
                contentType: false,
                type: "GET",
                crossDomain: true,
            })
            : $.getJSON(fullUrl)
        )
            .done((data) => {
                resolve(data);
            })
            .error(function() {
                console.log(
                    `Error in accessing ${database} database. Result may just be loading.`
                );
            });
    });
};

const responseExpressionData = (formData, queryURL) => {
    return new Promise(function (resolve) {
        const uploadRoute = queryURL;
        const fullUrl = [ $(".service-root").val(), uploadRoute ].join("/");
        (formData ?
            $.ajax({
                url: fullUrl,
                data: formData,
                processData: false,
                contentType: false,
                type: "GET",
                crossDomain: true
            }) :
            $.getJSON(fullUrl)
        ).done((expressionData) => {
            resolve(expressionData);
        }).error(console.log("Error in accessing expression database. Result may just be loading."));
    });
};
// Expression DB Access Functions

const queryExpressionDatabase = (query) => {
    let queryURL = buildExpressionURL({dataset: query.dataset}, query.genes);
    return responseExpressionData("", queryURL);
    const queryURL = buildQueryURL("expressiondb", query);
    return responseData("expression", "", queryURL);
};

// Network DB Access Functions

const buildNetworkGenesQuery = (genes) => {
    let result = "";
    for (let gene in genes) {
        result += `${gene},`;
    }
    return result.substring(0, result.length - 1);
const queryNetworkDatabase = (query) => {
    const queryURL = buildQueryURL("networkdb", query);
    return responseData("network", "", queryURL);
};

const buildNetworkURL = function (queryType, queryInfo) {
    let baseQuery = `networkdb?type=${queryType}`;
    if (queryInfo !== null) {
        for (let header in queryInfo) {
            if (header === "genes") {
                baseQuery += `&${header}=${buildNetworkGenesQuery(queryInfo[header])}`;
            } else {
                baseQuery += `&${header}=${queryInfo[header]}`;
            }
        }
    }
    return baseQuery;
};
// Upload Custom Workbook Functions

const responseNetworkData = (formData, queryURL) => {
    return new Promise(function (resolve) {
        const uploadRoute = queryURL;
        const fullUrl = [ $(".service-root").val(), uploadRoute ].join("/");
        (formData ?
            $.ajax({
                url: fullUrl,
                data: formData,
                processData: false,
                contentType: false,
                type: "GET",
                crossDomain: true
            }) :
            $.getJSON(fullUrl)
        ).done((networkData) => {
            resolve(networkData);
        }).error(console.log("Error in accessing network database. Result may just be loading."));
    });
const uploadCustomWorkbook = (workbook, grnState) => {
    const queryURL = buildQueryURL("upload-custom-workbook", workbook);
    return responseCustomWorkbookData(grnState, queryURL, workbook.name);
};

const queryNetworkDatabase = (query) => {
    let queryURL = buildNetworkURL(query.type, query.info);
    return responseNetworkData("", queryURL);
};
const constructFullUrl = (queryURL) =>
    [$(".service-root").val(), queryURL].join("/");

// Upload Custom Workbook Functions
const buildCustomWorkbookURL = (name, genes, links) => {
    let baseQuery = `upload-custom-workbook?name=${name}`;
    let genesString = "";
    let linksString = "";
    let genesByIndex = {};
    let i = 0;
    for (let gene in genes) {
        genesString += `${genes[gene]},`;
        genesByIndex[gene] = i;
        i++;
    }
    for (let regulator in links) {
        for (let target of links[regulator]) {
            linksString += `${genesByIndex[regulator]}->${genesByIndex[target]},`;
        }
    }
    baseQuery += `&genes=${genesString.substring(0, genesString.length - 1)}`;
    baseQuery += `&links=${linksString.substring(0, linksString.length - 1)}`;
    return baseQuery;
};
const getWorkbookFromForm = (formData, queryURL) => {
    const fullUrl = constructFullUrl(queryURL);

const uploadCustomWorkbook = (workbook, grnState) => {
    let queryURL = buildCustomWorkbookURL(workbook.name, workbook.genes, workbook.links);
    return responseCustomWorkbookData(grnState, queryURL, workbook.name);
    // The presence of formData is taken to indicate a POST.
    return formData
        ? $.ajax({
            url: fullUrl,
            data: formData,
            processData: false,
            contentType: false,
            type: "POST",
            crossDomain: true,
        })
        : $.getJSON(fullUrl);
};

const getWorkbookFromUrl = (queryURL) => {
    const fullUrl = constructFullUrl(queryURL);
    return $.getJSON(fullUrl);
};

export {queryExpressionDatabase, queryNetworkDatabase, uploadCustomWorkbook};
export {
    queryExpressionDatabase,
    queryNetworkDatabase,
    uploadCustomWorkbook,
    getWorkbookFromForm,
    getWorkbookFromUrl,
};
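For orientation, the refactored helpers can be exercised like this (a sketch; the exact query keys, such as `dataset` and `genes`, are assumptions based on the parameters the old URL builders emitted):

```
import { queryExpressionDatabase } from "./api/grnsight-api";

// Builds "expressiondb?dataset=...&genes=..." via buildQueryURL and
// resolves with the JSON payload returned by the service.
queryExpressionDatabase({ dataset: "GSE9336", genes: "CIN5,GLN3" })
    .then((expressionData) => console.log(expressionData));
```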
web-client/public/js/setup-load-and-import-handlers.js (350 changes: 181 additions & 169 deletions)
@@ -1,200 +1,212 @@
import { updateApp } from "./update-app";

import {
    DEMO_INFORMATION,
    UNWEIGHTED_DEMO_PATH,
    WEIGHTED_DEMO_PATH,
    SCHADE_INPUT_PATH,
    SCHADE_OUTPUT_PATH,
    WEIGHTED_DEMO_NAME,
    UNWEIGHTED_DEMO_NAME,
    SCHADE_INPUT_NAME,
    SCHADE_OUTPUT_NAME,
    DEMO_INFORMATION,
    UNWEIGHTED_DEMO_PATH,
    WEIGHTED_DEMO_PATH,
    SCHADE_INPUT_PATH,
    SCHADE_OUTPUT_PATH,
    WEIGHTED_DEMO_NAME,
    UNWEIGHTED_DEMO_NAME,
    SCHADE_INPUT_NAME,
    SCHADE_OUTPUT_NAME,
} from "./constants";
import { getWorkbookFromForm, getWorkbookFromUrl } from "./api/grnsight-api";

const demoFiles = [UNWEIGHTED_DEMO_PATH, WEIGHTED_DEMO_PATH, SCHADE_INPUT_PATH, SCHADE_OUTPUT_PATH];
const demoFiles = [
    UNWEIGHTED_DEMO_PATH,
    WEIGHTED_DEMO_PATH,
    SCHADE_INPUT_PATH,
    SCHADE_OUTPUT_PATH,
];

const submittedFilename = $upload => {
    let path = $upload.val();
    let fakePathCheck = path.search("\\\\") + 1;
const submittedFilename = ($upload) => {
    let path = $upload.val();
    let fakePathCheck = path.search("\\\\") + 1;

    while (fakePathCheck) {
        path = path.substring(fakePathCheck);
        fakePathCheck = path.search("\\\\") + 1;
    }
    while (fakePathCheck) {
        path = path.substring(fakePathCheck);
        fakePathCheck = path.search("\\\\") + 1;
    }

    return path;
    return path;
};

const createFileForm = $upload => {
    const formData = new FormData();
    formData.append("file", $upload[0].files[0]);
    return formData;
const createFileForm = ($upload) => {
    const formData = new FormData();
    formData.append("file", $upload[0].files[0]);
    return formData;
};

const uploadEpilogue = event => {
    if (window.ga) {
        window.ga("send", "pageview", {
            page: "/GRNsight/upload",
            sessionControl: "start"
        });
    }
const uploadEpilogue = (event) => {
    if (window.ga) {
        window.ga("send", "pageview", {
            page: "/GRNsight/upload",
            sessionControl: "start",
        });
    }

    $("a.upload > input[type=file]").val("");
    event.preventDefault();
    $("a.upload > input[type=file]").val("");
    event.preventDefault();
};
const disableUpload = state => {
    $(".upload").attr("disabled", state);
    $(".upload-sif").attr("disabled", state);
    $(".upload-graphml").attr("disabled", state);
const disableUpload = (state) => {
    $(".upload").attr("disabled", state);
    $(".upload-sif").attr("disabled", state);
    $(".upload-graphml").attr("disabled", state);
};

const uploadHandler = (uploader) => {
    return function (event) { // Must be `function` due to use of `this`.
        const $upload = $(this);
        const filename = submittedFilename($upload); // TODO: remove before master release (beta@4.0.6)
        if ($upload[0].files[0].size < 2000000) {
            // disable upload button to prevent multiple uploads
            disableUpload(true);
            const formData = createFileForm($upload);
            uploader(filename, formData);
            uploadEpilogue(event);
        } else {
            let errorString = "The file uploaded is too large. Please try again with a file smaller than 1 MB.";
            $("#error").html(errorString);
            $("#errorModal").modal("show");
        }
    };
};

const workbookErrorDisplayer = xhr => {
    // re-enable upload button
    disableUpload(false);
    // Deleted status, error for argument because it was never used
    const err = JSON.parse(xhr.responseText);
    let errorString = "Your graph failed to load.<br><br>";

    if (!err.errors) { // will be falsy if an error was thrown before the workbook was generated
        errorString += err;
    return function(event) {
        // Must be `function` due to use of `this`.
        const $upload = $(this);
        const filename = submittedFilename($upload); // TODO: remove before master release (beta@4.0.6)
        if ($upload[0].files[0].size < 2000000) {
            // disable upload button to prevent multiple uploads
            disableUpload(true);
            const formData = createFileForm($upload);
            uploader(filename, formData);
            uploadEpilogue(event);
        } else {
        errorString = err.errors.reduce(
            (currentErrorString, currentError) =>
                `${currentErrorString}${currentError.possibleCause} ${currentError.suggestedFix}<br><br>`,

            errorString
        );
            let errorString =
                "The file uploaded is too large. Please try again with a file smaller than 1 MB.";
            $("#error").html(errorString);
            $("#errorModal").modal("show");
        }
    }

    $("#error").html(errorString);
    $("#errorModal").modal("show");
    };
};

let reloader = () => { };

const returnUploadRoute = filename => {
    if (demoFiles.indexOf(filename) !== -1) {
        return filename;
    } else if (filename.includes(".xlsx")) {
        return "upload";
    } else if (filename.includes(".sif")) {
        return "upload-sif";
    } else if (filename.includes(".graphml")) {
        return "upload-graphml";
    }
const workbookErrorDisplayer = (xhr) => {
    // re-enable upload button
    disableUpload(false);
    // Deleted status, error for argument because it was never used
    const err = JSON.parse(xhr.responseText);
    let errorString = "Your graph failed to load.<br><br>";

    if (!err.errors) {
        // will be falsy if an error was thrown before the workbook was generated
        errorString += err;
    } else {
        errorString = err.errors.reduce(
            (currentErrorString, currentError) =>
                `${currentErrorString}${currentError.possibleCause} ${currentError.suggestedFix}<br><br>`,

            errorString
        );
    }

    $("#error").html(errorString);
    $("#errorModal").modal("show");
};

export const setupLoadAndImportHandlers = grnState => {
    const loadGrn = (name, formData) => {
        const uploadRoute = returnUploadRoute(name);
        const fullUrl = [ $(".service-root").val(), uploadRoute ].join("/");
        // The presence of formData is taken to indicate a POST.
        (formData ?
            $.ajax({
                url: fullUrl,
                data: formData,
                processData: false,
                contentType: false,
                type: "POST",
                crossDomain: true
            }) :
            $.getJSON(fullUrl)
        ).done((workbook, textStatus, jqXhr) => {
            grnState.name = name || jqXhr.getResponseHeader("X-GRNsight-Filename");
            if (demoFiles.indexOf(name) > -1) {
                switch (name) {
                case WEIGHTED_DEMO_PATH:
                    grnState.name = WEIGHTED_DEMO_NAME;
                    break;
                case UNWEIGHTED_DEMO_PATH:
                    grnState.name = UNWEIGHTED_DEMO_NAME;
                    break;
                case SCHADE_INPUT_PATH:
                    grnState.name = SCHADE_INPUT_NAME;
                    break;
                case SCHADE_OUTPUT_PATH:
                    grnState.name = SCHADE_OUTPUT_NAME;
                }
            }
            grnState.workbook = workbook;
            if (uploadRoute !== "upload") {
                grnState.annotateLinks();
            }
            reloader = () => loadGrn(name, formData);
            // re-enable upload button
            disableUpload(false);
            updateApp(grnState);
            // displayStatistics(workbook);
        }).error(workbookErrorDisplayer);
    };
    /*
     * Thanks to http://stackoverflow.com/questions/6974684/how-to-send-formdata-objects-with-ajax-requests-in-jquery
     * for helping to resolve this.
     */

    // $(".upload").change(uploadHandler(loadGrn));
    $("body").on("change", ".upload", uploadHandler(loadGrn));
    const loadDemo = (url, value) => {
        $("#demoSourceDropdown option[value='" + value.substring(1) + "']").prop("selected", true);
        loadGrn(url);
        reloader = () => loadGrn(url);

        $("a.upload > input[type=file]").val("");
    };

    const initializeDemoFile = (demoClass, demoPath, demoName) => {
        // Deleted parameter `event`
        $(demoClass).on("click", () => {
            loadDemo(demoPath, demoClass, demoName);
        });

        $("#demoSourceDropdown").on("change", () => {
            loadDemo(demoPath, demoClass, demoName);
        });
    };

    DEMO_INFORMATION.forEach(demoInfo => initializeDemoFile.apply(null, demoInfo));

    $("body").on("click", ".reload", function () {
        // Deleted `event` parameter but need `function` because of `this`.
        if (!$(this).parent().hasClass("disabled")) {
            if ($.isFunction(reloader)) {
                reloader();
            }
        }
    });
let reloader = () => {};

const returnUploadRoute = (filename) => {
    if (demoFiles.indexOf(filename) !== -1) {
        return filename;
    } else if (filename.includes(".xlsx")) {
        return "upload";
    } else if (filename.includes(".sif")) {
        return "upload-sif";
    } else if (filename.includes(".graphml")) {
        return "upload-graphml";
    }
};

export const responseCustomWorkbookData = (grnState, queryURL, name) => {
    const uploadRoute = queryURL;
    const fullUrl = [ $(".service-root").val(), uploadRoute ].join("/");
    $.getJSON(fullUrl).done((workbook) => {
        grnState.name = name;
export const setupLoadAndImportHandlers = (grnState) => {
    const loadGrn = (name, formData) => {
        const uploadRoute = returnUploadRoute(name);
        // The presence of formData is taken to indicate a POST.
        getWorkbookFromForm(formData, uploadRoute)
            .done((workbook, textStatus, jqXhr) => {
                grnState.name = name || jqXhr.getResponseHeader("X-GRNsight-Filename");
                if (demoFiles.indexOf(name) > -1) {
                    switch (name) {
                    case WEIGHTED_DEMO_PATH:
                        grnState.name = WEIGHTED_DEMO_NAME;
                        break;
                    case UNWEIGHTED_DEMO_PATH:
                        grnState.name = UNWEIGHTED_DEMO_NAME;
                        break;
                    case SCHADE_INPUT_PATH:
                        grnState.name = SCHADE_INPUT_NAME;
                        break;
                    case SCHADE_OUTPUT_PATH:
                        grnState.name = SCHADE_OUTPUT_NAME;
                    }
                }
                grnState.workbook = workbook;
                grnState.annotateLinks();
                grnState.workbook.expressionNames = Object.keys(workbook.expression);
                if (uploadRoute !== "upload") {
                    grnState.annotateLinks();
                }
                reloader = () => loadGrn(name, formData);
                // re-enable upload button
                disableUpload(false);
                updateApp(grnState);
                reloader = () => responseCustomWorkbookData(grnState, queryURL, name);
                // displayStatistics(workbook);
            })
            .error(workbookErrorDisplayer);
    };
    /*
     * Thanks to http://stackoverflow.com/questions/6974684/how-to-send-formdata-objects-with-ajax-requests-in-jquery
     * for helping to resolve this.
     */

    // $(".upload").change(uploadHandler(loadGrn));
    $("body").on("change", ".upload", uploadHandler(loadGrn));
    const loadDemo = (url, value) => {
        $("#demoSourceDropdown option[value='" + value.substring(1) + "']").prop(
            "selected",
            true
        );
        loadGrn(url);
        reloader = () => loadGrn(url);

        $("a.upload > input[type=file]").val("");
    };

    const initializeDemoFile = (demoClass, demoPath, demoName) => {
        // Deleted parameter `event`
        $(demoClass).on("click", () => {
            loadDemo(demoPath, demoClass, demoName);
        });
        $("#demoSourceDropdown").on("change", () => {
            const selected = `.${$("#demoSourceDropdown").val()}`;
            if (selected === demoClass) {
                loadDemo(demoPath, demoClass, demoName);
            }
        });
    };

    DEMO_INFORMATION.forEach((demoInfo) =>
        initializeDemoFile.apply(null, demoInfo)
    );

    $("body").on("click", ".reload", function() {
        // Deleted `event` parameter but need `function` because of `this`.
        if (
            !$(this)
                .parent()
                .hasClass("disabled")
        ) {
            if ($.isFunction(reloader)) {
                reloader();
            }
        }
    });
};

export const responseCustomWorkbookData = (grnState, queryURL, name) => {
    const uploadRoute = queryURL;
    getWorkbookFromUrl(uploadRoute).done((workbook) => {
        grnState.name = name;
        grnState.workbook = workbook;
        // Reset the node coloring dataset selection
        grnState.nodeColoring.topDataset = undefined;
        grnState.nodeColoring.bottomDataset = undefined;
        grnState.annotateLinks();
        disableUpload(false);
        updateApp(grnState);
        reloader = () => responseCustomWorkbookData(grnState, queryURL, name);
    });
};