I am currently working on a modified implementation of the GNN layer GarNet (see here). The model has multiple outputs: one for regression and one for classification. I am passing generated test data as NPY files. The NPY format only supports storing a single numpy array per file, and as far as I am aware it is not possible to concatenate numpy arrays of different dimensions into one. Hence, I cannot directly pass multiple inputs/outputs as tb data. So my question is: how can one structure tb data with multiple inputs/outputs of different sizes so that it is valid? Thanks in advance.
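For context, here is a minimal sketch of the constraint being described (the shapes are hypothetical, chosen only for illustration): `np.save` writes exactly one array per `.npy` file, and arrays whose non-batch dimensions differ cannot be concatenated directly.

```python
import numpy as np

# Hypothetical model inputs: same batch size, different shapes per sample.
x = np.zeros((100, 10, 2))
y = np.zeros((100, 5, 3))

# np.save stores a single array per .npy file, so x and y would
# need two separate files. Concatenating them directly fails because
# the dimensions other than the concatenation axis do not match:
try:
    np.concatenate([x, y], axis=0)
except ValueError as e:
    print("cannot concatenate:", e)
```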
Replies: 1 comment
This feature is not as fully flexible as the `predict()` call, where you can pass live numpy objects. There are constraints, mainly that the number of samples (the first dimension of your inputs) must be the same. For example, say you work with tensors `x` of shape `(batch, 10, 2)` and `y` of shape `(batch, 5, 3)`; this `batch` dimension must be the same. Say it is 100 (`batch=100`). You would then pass an array of shape `(100, 10*2 + 5*3)` as a `.dat` file, where each line corresponds to one flattened sample: 100 lines, each with 35 values separated by spaces. The same applies to the output predictions. You can write a script that produces such files; unless there are bugs, this should work.

Also note that this is only useful if you are doing co-simulation, where latency (running time) depends on the input and you want to get the correct timing information. Otherwise, if you are testing precision/accuracy performance, converting from memory to a text file may result in a loss of precision, so use `predict()` instead.
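A script along the lines described above might look like the following sketch. The array shapes and the output filename `tb_input_features.dat` are assumptions for illustration; adapt them to your model's actual inputs.

```python
import numpy as np

# Hypothetical shapes matching the example above:
# x has shape (batch, 10, 2), y has shape (batch, 5, 3), batch = 100.
batch = 100
rng = np.random.default_rng(0)
x = rng.random((batch, 10, 2)).astype(np.float32)
y = rng.random((batch, 5, 3)).astype(np.float32)

# Flatten each sample and concatenate along the feature axis:
# one row per sample, 10*2 + 5*3 = 35 values per row.
flat = np.hstack([x.reshape(batch, -1), y.reshape(batch, -1)])
assert flat.shape == (100, 35)

# Write one flattened sample per line, values separated by spaces.
# The filename is hypothetical; use whatever your testbench expects.
np.savetxt("tb_input_features.dat", flat, fmt="%f")
```

The same flattening applies to the expected output predictions, written to a second `.dat` file with one row per sample.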