
Commit 22f5a9d

Add files via upload
0 parents  commit 22f5a9d

19 files changed: +7544 additions, 0 deletions

README.md (+29 lines)

@@ -0,0 +1,29 @@
# TUGDA: Task uncertainty guided domain adaptation for robust generalization of cancer drug response prediction from *in vitro* to *in vivo* settings

TUGDA is a novel multi-task unsupervised domain adaptation method that transfers knowledge across tasks/domains in a unified framework by quantifying the uncertainty of each predictor and weighting its influence on the shared domain/task feature representations. By relying more on low-uncertainty predictors, TUGDA notably reduces cases of negative transfer and successfully transfers knowledge across biological models.
**Model Representation:**

![TUGDA model overview](https://github.com/CSB5/TUGDA/blob/main/fig1_model.png)
TUGDA framework for multi-task learning and domain adaptation in cancer drug response prediction. The layer L receives input data from different biological models and maps it to a latent space Z. The multi-task layer S then uses these latent features to make predictions and to compute task uncertainties U_t, which regularize the amount of transfer from tasks/domains in A to the latent features in Z through an autoencoder regularization. Using adversarial learning, the discriminator D receives the extracted features from Z and regularizes L to learn domain-invariant features. L, S, A and D each consist of a single fully connected layer.
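As a rough, hypothetical sketch (not the reference implementation), the four single-layer components described in the caption could be wired up in PyTorch as follows; the dimensions `n_features`, `n_latent`, `n_tasks` and all names are illustrative placeholders.

```python
import torch
import torch.nn as nn

class TUGDASketch(nn.Module):
    """Illustrative sketch of the components L, S, A and D described above
    (each a single fully connected layer); not the authors' implementation."""
    def __init__(self, n_features: int, n_latent: int, n_tasks: int):
        super().__init__()
        self.L = nn.Linear(n_features, n_latent)  # shared feature extractor -> latent space Z
        self.S = nn.Linear(n_latent, n_tasks)     # multi-task drug-response predictions
        self.A = nn.Linear(n_tasks, n_latent)     # task-to-latent path for the autoencoder regularization
        self.D = nn.Linear(n_latent, 1)           # domain discriminator for adversarial learning

    def forward(self, x):
        z = torch.relu(self.L(x))   # latent features Z
        y_hat = self.S(z)           # per-task predictions (one output per drug/task)
        z_rec = self.A(y_hat)       # reconstruction of Z from the task outputs
        d_logit = self.D(z)         # domain logit used in the adversarial loss
        return y_hat, z, z_rec, d_logit
```

In the full model, the reconstruction path through A is regularized by the task uncertainties U_t and the discriminator D is trained adversarially against L; see the notebooks below for the actual training code.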
# Usage examples

We provide two notebooks as examples of how training and testing are performed with TUGDA's framework. Both notebooks are self-contained (e.g., they install the required libraries and load the necessary data).

Step zero: run `bash setup_repo.sh`.
1) For the MTL setting, please refer to [this notebook](https://github.com/CSB5/TUGDA/blob/main/tugda_mtl_example.ipynb), which reproduces TUGDA's results for Figure 2.

2) For the domain adaptation setting, please refer to [this notebook](https://github.com/CSB5/TUGDA/blob/main/tugda_da_example.ipynb), which reproduces TUGDA's results for the domain adaptation from cell lines to PDX (Figure 3).
# Data

This repository uses data from the publicly available [GDSC](https://www.cancerrxgene.org/) and [PDX Novartis](https://www.nature.com/articles/nm.3954) datasets.
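For reference, the cross-validation splits ship as plain CSV files under `data/` (see the file list below). A minimal pandas sketch for loading one fold might look like this; the exact index/header layout is an assumption, so adjust `index_col` as needed.

```python
import pandas as pd

# Hypothetical loading sketch: the column layout of these CSVs is not
# documented in the README, so the index_col=0 assumption may need adjusting.
x_train = pd.read_csv("data/cl_x_train_o_k1.csv", index_col=0)  # cell-line features, fold 1
y_train = pd.read_csv("data/cl_y_train_o_k1.csv", index_col=0)  # drug-response labels, fold 1
x_test = pd.read_csv("data/cl_x_test_o_k1.csv", index_col=0)
y_test = pd.read_csv("data/cl_y_test_o_k1.csv", index_col=0)

print(x_train.shape, y_train.shape)  # sanity check: feature and label rows should align
```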
# Citation

Preprint to be released.
# Contact information

For additional information, help, and bug reports, please email Rafael Peres da Silva ([[email protected]](mailto:[email protected]))

data/GDSCDA_fpkm_AUC_all_drugs.zip (13 MB, binary)
data/PDX_MTL_DA.csv (+400 lines)
data/cl_x_test_o_k1.csv (+270 lines)
data/cl_x_test_o_k2.csv (+269 lines)
data/cl_x_test_o_k3.csv (+269 lines)
data/cl_x_train_o_k1.csv (+537 lines)
data/cl_x_train_o_k2.csv (+538 lines)
data/cl_x_train_o_k3.csv (+538 lines)
data/cl_y_test_o_k1.csv (+270 lines)
data/cl_y_test_o_k2.csv (+269 lines)
data/cl_y_test_o_k3.csv (+269 lines)
data/cl_y_train_o_k1.csv (+537 lines)
data/cl_y_train_o_k2.csv (+538 lines)
data/cl_y_train_o_k3.csv (+538 lines)
fig1_model.png (99.5 KB, binary)

setup_repo.sh (+6 lines)

@@ -0,0 +1,6 @@
# create a dedicated conda environment and activate it
conda create --name TUGDA_rep python=3.8
source activate TUGDA_rep
# install Jupyter and register the environment as a notebook kernel
pip install jupyter
pip install ipykernel
python -m ipykernel install --user --name TUGDA_rep --display-name "Python (TUGDA_rep)"
# launch Jupyter to open the example notebooks
jupyter notebook
