Commit 00f4adb
Merge branch 'JuliaLogging:master' into fix-MultilineChartContent-not-defined
2 parents 2ad5f15 + 37bae6b

134 files changed: +10970 −9796 lines

.github/workflows/UnitTest.yml (+1 −1)

@@ -13,7 +13,7 @@ jobs:
       fail-fast: false
       matrix:
         os: [macos-latest, ubuntu-latest]
-        julia_version: ["1.3", "1", "nightly"]
+        julia_version: ["1.6", "1", "nightly"]

     runs-on: ${{ matrix.os }}
     env:

.gitignore (+1 −1)

@@ -6,4 +6,4 @@ test/test_logs
 docs/Manifest.toml

 gen/proto
-gen/protojl
+gen/protojl

Project.toml (+8 −5)

@@ -1,7 +1,7 @@
 name = "TensorBoardLogger"
 uuid = "899adc3e-224a-11e9-021f-63837185c80f"
 authors = ["Filippo Vicentini <[email protected]>"]
-version = "0.1.20"
+version = "0.1.23"

 [deps]
 CRC32c = "8bf52ea8-c179-5cab-976a-9e18b702a9bc"
@@ -14,17 +14,20 @@ StatsBase = "2913bbd2-ae8a-5f71-8c99-4fb6c76f3a91"
 [compat]
 FileIO = "1.2.3"
 ImageCore = "0.8.1, 0.9"
-ProtoBuf = "0.10, 0.11"
+ProtoBuf = "1.0.11"
 Requires = "0.5, 1"
-StatsBase = "0.27, 0.28, 0.29, 0.30, 0.31, 0.32, 0.33"
-julia = "1.3"
+StatsBase = "0.27, 0.28, 0.29, 0.30, 0.31, 0.32, 0.33, 0.34"
+julia = "1.6"

 [extras]
-Minio = "4281f0d9-7ae0-406e-9172-b7277c1efa20"
+Cairo = "159f3aea-2a34-519c-b102-8c37f9878175"
+Fontconfig = "186bb1d3-e1f7-5a2c-a377-96d770f13627"
+Gadfly = "c91e804a-d5a3-530f-b6f0-dfbca275c004"
 ImageMagick = "6218d12a-5da1-5696-b52f-db25d2ecc6d1"
 LightGraphs = "093fc24a-ae57-5d10-9952-331d41423f4d"
 Logging = "56ddb016-857b-54e1-b83d-db4d58db5568"
 MLDatasets = "eb30cadb-4394-5ae3-aed4-317e484a6458"
+Minio = "4281f0d9-7ae0-406e-9172-b7277c1efa20"
 Plots = "91a5bcdd-55d7-5caf-9e0b-520d859cae80"
 PyPlot = "d330b81b-6aea-500a-939a-2ce795aea3ee"
 Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

README.md (+3 −3)

@@ -39,8 +39,8 @@ logger in Julia:

 You can log to TensorBoard any type. Numeric types will be logged as scalar,
 arrays will be binned into histograms, images and audio will be logged as such,
-and we even support [Plots](https://github.com/JuliaPlots/Plots.jl) and
-[PyPlot](https://github.com/JuliaPlots/Plots.jl) figures!
+and we even support [Plots](https://github.com/JuliaPlots/Plots.jl),
+[PyPlot](https://github.com/JuliaPlots/Plots.jl) and [Gadfly](https://github.com/GiovineItalia/Gadfly.jl) figures!

 For details about how types are logged by default, or how to customize this behaviour for your custom types,
 refer to the documentation or the examples folder.
@@ -71,7 +71,7 @@ end
 ```

 ## Integration with third party packages
-We also support native logging of the types defined by a few third-party packages, such as `Plots` and `PyPlot` plots.
+We also support native logging of the types defined by a few third-party packages, such as `Plots`, `PyPlot` and `Gadfly` plots.
 If there are other libraries that you think we should include in the list, please open an issue.

 ## Roadmap

docs/make.jl (+3 −1)

@@ -10,11 +10,13 @@ makedocs(
         "Backends" => "custom_behaviour.md",
         "Reading back data" => "deserialization.md",
         "Extending" => "extending_behaviour.md",
-        "Explicit Interface" => "explicit_interface.md"
+        "Explicit Interface" => "explicit_interface.md",
+        "Hyperparameter logging" => "hyperparameters.md"
     ],
     "Examples" => Any[
         "Flux.jl" => "examples/flux.md"
         "Optim.jl" => "examples/optim.md"
+        "Hyperparameter tuning" => "examples/hyperparameter_tuning.md"
     ]
 ],
 format = Documenter.HTML(

docs/src/custom_behaviour.md (+1 −1)

@@ -5,7 +5,7 @@ value is sent to:

 - `::AbstractVector{<:Real}` -> [Histogram backend](https://www.tensorflow.org/guide/tensorboard_histograms) as a vector
 - `::StatsBase.Histogram` -> [Histogram backend](https://www.tensorflow.org/guide/tensorboard_histograms)
-- `(bin_edges, weights)::Tuple{AbstractVector,AbstractVector}` where `length(bin_edges)==length(weights)+1`, is interpreted as an histogram. (*Will be deprecated.* Please use `TBHistogram(edges, weights)` for this).
+<!-- - `(bin_edges, weights)::Tuple{AbstractVector,AbstractVector}` where `length(bin_edges)==length(weights)+1`, is interpreted as an histogram. (*Will be deprecated.* Please use `TBHistogram(edges, weights)` for this). -->
 - `::Real` -> Scalar backend
 - `::AbstractArray{<:Colorant}` -> [Image backend](https://www.tensorflow.org/tensorboard/r2/image_summaries)
 - `::Any` -> Text Backend
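The dispatch rules kept by this change can be exercised with a short sketch; the log directory and tag names below are illustrative, not part of the commit:

```julia
using TensorBoardLogger, Logging

lg = TBLogger("dispatch_demo")  # illustrative log directory
with_logger(lg) do
    @info "metrics" loss = 0.5           # ::Real -> Scalar backend
    @info "weights" layer1 = randn(100)  # ::AbstractVector{<:Real} -> Histogram backend
    @info "notes" status = "restarted"   # ::Any -> Text backend
end
```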
docs/src/examples/hyperparameter_tuning.md (+53, new file)

@@ -0,0 +1,53 @@
+# Hyperparameter tuning
+
+We will start this example by setting up a simple random walk experiment, and seeing the effect of the hyperparameter `bias` on the results.
+
+First, import the packages we will need with:
+```julia
+using TensorBoardLogger, Logging
+using Random
+```
+Next, we will create a function which runs the experiment and logs the results, including the hyperparameters stored in the `config` dictionary.
+```julia
+function run_experiment(id, config)
+    logger = TBLogger("random_walk/run$id", tb_append)
+
+    # Specify all the metrics we want to track in a list
+    metric_names = ["scalar/position"]
+    write_hparams!(logger, config, metric_names)
+
+    epochs = config["epochs"]
+    sigma = config["sigma"]
+    bias = config["bias"]
+    with_logger(logger) do
+        x = 0.0
+        for i in 1:epochs
+            x += sigma * randn() + bias
+            @info "scalar" position = x
+        end
+    end
+    nothing
+end
+```
+Now we can write a script which runs an experiment over a set of parameter values.
+```julia
+id = 0
+for bias in LinRange(-0.1, 0.1, 11)
+    for epochs in [50, 100]
+        config = Dict(
+            "bias"=>bias,
+            "epochs"=>epochs,
+            "sigma"=>0.1
+        )
+        run_experiment(id, config)
+        id += 1
+    end
+end
+```
+
+Below is an example of the dashboard you get when you open TensorBoard with the command:
+```sh
+tensorboard --logdir=random_walk
+```
+
+![tuning plot](tuning.png)

docs/src/examples/tuning.png (269 KB, binary file)
docs/src/explicit_interface.md (+1 −1)

@@ -44,7 +44,7 @@ See [TensorBoard Custom Scalar page](https://github.com/tensorflow/tensorboard/t

 For example, to combine in the same plot panel the two curves logged under tags `"Curve/1"` and `"Curve/2"` you can run once the command:
 ```julia
-layout = Dict("Cat" => Dict("Curve" => ("Multiline", ["Curve/1", "Curve/2"])))
+layout = Dict("Cat" => Dict("Curve" => (tb_multiline, ["Curve/1", "Curve/2"])))

 log_custom_scalar(lg, layout)
docs/src/extending_behaviour.md (+1 −1)

@@ -12,7 +12,7 @@ At the end of this step, every pair in `objects` will be logged to a specific
 backend, according to the following rules:

 - `::AbstractVector{<:Real}` -> [Histogram backend](https://www.tensorflow.org/guide/tensorboard_histograms) as a vector
-- `::Tuple{AbstractVector,AbstractVector}` [Histogram backend](https://www.tensorflow.org/guide/tensorboard_histograms) as an histogram
+<!-- - `::Tuple{AbstractVector,AbstractVector}` [Histogram backend](https://www.tensorflow.org/guide/tensorboard_histograms) as an histogram -->
 - `::Real` -> Scalar backend
 - `::AbstractArray{<:Colorant}` -> [Image backend](https://www.tensorflow.org/tensorboard/r2/image_summaries)
 - `::Any` -> Text Backend

docs/src/hyperparameters.md (+10, new file)

@@ -0,0 +1,10 @@
+# Hyperparameter logging
+
+In addition to logging the experiments, you may wish to also visualise the effect of hyperparameters on some plotted metrics. This can be done by logging the hyperparameters via the `write_hparams!` function, which takes a dictionary mapping hyperparameter names to their values (currently limited to `Real`, `Bool` or `String` types), along with the names of any metrics that you want to view the effects of.
+
+You can see how the HParams dashboard in TensorBoard can be used to tune hyperparameters on the [tensorboard website](https://www.tensorflow.org/tensorboard/hyperparameter_tuning_with_hparams).
+
+## API
+```@docs
+write_hparams!
+```
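The `write_hparams!` call described in this new page takes a logger, a dictionary of hyperparameters, and a list of metric tags. A minimal sketch, with illustrative directory, parameter and metric names:

```julia
using TensorBoardLogger

lg = TBLogger("hparam_demo")  # illustrative log directory

# Values are limited to Real, Bool or String types
hparams = Dict{String,Any}(
    "lr" => 0.01,
    "use_gpu" => false,
    "optimiser" => "adam",
)
write_hparams!(lg, hparams, ["scalar/loss"])
```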

docs/src/index.md (+7 −3)

@@ -111,11 +111,15 @@ at [Reading back TensorBoard data](@ref)
 We also support logging custom types from the following third-party libraries:
 - [Plots.jl](https://github.com/JuliaPlots/Plots.jl): the `Plots.Plot` type will be rendered to PNG at the resolution specified by the object and logged as an image
 - [PyPlot.jl](https://github.com/JuliaPy/PyPlot.jl): the `PyPlot.Figure` type will be rendered to PNG at the resolution specified by the object and logged as an image
+- [Gadfly.jl](https://github.com/GiovineItalia/Gadfly.jl): the `Gadfly.Plot` type will be rendered to PNG at the resolution specified by the object and logged as an image. The `Cairo` and `Fontconfig` packages must be imported for this functionality to work, as they are required by `Gadfly`.
 - [Tracker.jl](https://github.com/FluxML/Tracker.jl): the `TrackedReal` and `TrackedArray` types will be logged as vector data
 - [ValueHistories.jl](https://github.com/JuliaML/ValueHistories.jl): the `MVHistory` type is used to store the deserialized content of .proto files.

 ## Explicit logging

-In alternative, you can also log data to TensorBoard through its functional interface,
-by calling the relevant method with a tag string and the data. For information
-on this interface refer to [Explicit interface](@ref)...
+As an alternative, you can also log data to TensorBoard through its functional interface, by calling the relevant method with a tag string and the data. For information on this interface refer to [Explicit interface](@ref).
+
+## Hyperparameter tuning
+
+Many experiments rely on hyperparameters, which can be difficult to tune. TensorBoard allows you to visualise the effect of your hyperparameters on your metrics, giving you an intuition for the correct hyperparameters for your task. For information on this API, see the [Hyperparameter logging](@ref) manual page.

examples/Gadfly.jl (+14, new file)

@@ -0,0 +1,14 @@
+using TensorBoardLogger # import the TensorBoardLogger package
+using Logging # import the Logging package
+using Gadfly, Cairo, Fontconfig
+
+logger = TBLogger("Gadflylogs", tb_append) # create tensorboard logger
+
+################ log a Gadfly figure ################
+# using the logger interface
+x = rand(100)
+y = rand(100)
+p = plot(x=x, y=y, Geom.point);
+with_logger(logger) do
+    @info "gadfly" plot=p
+end

examples/HParams.jl (+38, new file)

@@ -0,0 +1,38 @@
+using TensorBoardLogger # import the TensorBoardLogger package
+using Logging # import the Logging package
+using Random # exports randn
+
+# Run 10 experiments to see a plot
+for j in 1:10
+    logger = TBLogger("random_walks/run$j", tb_append)
+
+    sigma = 0.1
+    epochs = 200
+    bias = (rand()*2 - 1) / 10 # create a random bias
+    use_seed = false
+    # Add in a dummy loss metric
+    with_logger(logger) do
+        x = 0.0
+        for i in 1:epochs
+            x += sigma * randn() + bias
+            @info "scalar" loss = x
+        end
+    end
+
+    # Hyperparameters are a dictionary of parameter names to their values. This
+    # supports numerical types, bools and strings. Non-bool numerical types
+    # are converted to Float64 to be displayed.
+    hparams_config = Dict{String, Any}(
+        "sigma"=>sigma,
+        "epochs"=>epochs,
+        "bias"=>bias,
+        "use_seed"=>use_seed,
+        "method"=>"MC"
+    )
+    # Specify a list of tags that you want to show up in the hyperparameter
+    # comparison
+    metrics = ["scalar/loss"]
+
+    # Write the hyperparameters and metrics config to the logger.
+    write_hparams!(logger, hparams_config, metrics)
+end

examples/Histograms.jl (+5 −4)

@@ -10,9 +10,10 @@ with_logger(logger) do
     x0 = 0.5+i/30; s0 = 0.5/(i/20);
     edges = collect(-5:0.1:5)
     centers = collect(edges[1:end-1] .+0.05)
-    histvals = [exp(-((c-x0)/s0)^2) for c = centers]
+    histvals = s0 * randn(length(centers)) .+ x0
     data_tuple = (edges, histvals)
-    @info "histogram/loggerinterface" autobin=rand(10).+0.1*i manualbin=data_tuple
+    @info "histogram/loggerinterface" autobin=s0 .* randn(100) .+ x0
+    @info "histogram/loggerinterface" manualbin=data_tuple
 end
 end

@@ -21,8 +22,8 @@ for i in 1:100
     x0 = 0.5+i/30; s0 = 0.5/(i/20);
     edges = collect(-5:0.1:5)
     centers = collect(edges[1:end-1] .+0.05)
-    histvals = [exp(-((c-x0)/s0)^2) for c = centers]
+    histvals = s0 * randn(length(centers)) .+ x0
     data_tuple = (edges, histvals)
-    log_histogram(logger, "histogram/explicitinterface/autobin", rand(10).+0.1*i, step = i) #automatic bins
+    log_histogram(logger, "histogram/explicitinterface/autobin", s0 .* randn(100) .+ x0, step = i) #automatic bins
     log_histogram(logger, "histogram/explicitinterface/manualbin", data_tuple, step = i) #manual bins
 end

examples/Scalars.jl (+25)

@@ -23,3 +23,28 @@ with_logger(logger) do
     @info "scalar/complex" y = z
 end
 end
+
+
+################ control step increments with context ################
+with_logger(logger) do
+    for epoch in 1:10
+        for i=1:100
+            # increments global_step by default
+            with_TBLogger_hold_step() do
+                # all of these are logged at the same global_step
+                # and the logger global_step is only then increased
+                @info "train1/scalar" val=i
+                @info "train2/scalar" val2=i/2
+                @info "train3/scalar" val3=100-i
+            end
+        end
+        # step increment at end can be disabled for easy train/test sync
+        with_TBLogger_hold_step(;step_at_end=false) do
+            # all of these are logged at the same global_step
+            # and the logger global_step is only then increased
+            @info "test1/scalar" epoch=epoch
+            @info "test2/scalar" epoch2=epoch^2
+            @info "test3/scalar" epoch3=epoch^3
+        end
+    end
+end

gen/Project.toml (+1 −1)

@@ -4,5 +4,5 @@ FilePathsBase = "48062228-2e41-5def-b9a4-89aafe57970f"
 Glob = "c27321d9-0574-5035-807b-f59d2c89b15c"
 ProtoBuf = "3349acd9-ac6a-5e09-bcdb-63829b23a429"

-[comapt]
+[compat]
 ProtoBuf = "0.9.1"

gen/compile_proto.jl (+18 −36)

@@ -21,63 +21,45 @@ pbpath =dirname(dirname(PosixPath(pathof(ProtoBuf))))/p"gen"
 cur_path = cwd()
 TBL_root = dirname(cur_path)

-src_dir = cur_path/"proto"
-out_dir = cur_path/"protojl"
+# src_dir = cur_path/"proto"
+src_dir = PosixPath(".")/"proto"
+out_dir = cur_path/"protojl"

 ## Clean the output directory
 rm(out_dir, force=true, recursive=true)

 ## First module
 function process_module(cur_module::AbstractString; base_module::AbstractString=cur_module, input_path=cur_module)
-    # Include search paths
-    includes = [src_dir, src_dir/base_module]

     # Output folder
-    module_out_dir = out_dir/cur_module
+    module_out_dir = out_dir/cur_module

     # Input files
-    infiles = glob("*.proto", src_dir/input_path)
+    infiles = split.(string.(glob("*.proto", src_dir/input_path)), '/') .|> (a -> a[3:end]) .|> a -> joinpath(a...)

     mkpath(module_out_dir)
-    includes_str=["--proto_path=$path" for path=includes]
-    run(ProtoBuf.protoc(`$includes_str --julia_out=$module_out_dir $infiles`))
-
-    nothing
+    relative_paths = string.(infiles)
+    search_directories = joinpath(@__DIR__, "proto")
+    output_directory = string(module_out_dir)
+    # println("relative_paths=$relative_paths")
+    # println("search_directories=$search_directories")
+    # println("output_directory=$output_directory")
+    ProtoBuf.protojl(relative_paths, search_directories, output_directory)
+    files_to_include = [string(module_out_dir/basename(file)) for file in infiles]
+    return files_to_include
 end

 #process_module("tensorflow", input_path="tensorflow/core/protobuf")

-process_module("tensorboard", input_path="tensorboard/compat/proto")
+files_to_include = process_module("tensorboard", input_path="tensorboard/compat/proto")

 #plugins = ["audio", "custom_scalar", "hparams", "histogram", "image", "scalar", "text"]
 plugins = ["custom_scalar", "hparams", "text"]
-for plugin in plugins
-    process_module("tensorboard/plugins/$plugin", base_module="tensorboard")
-end

+append!(files_to_include, (process_module("tensorboard/plugins/$plugin", base_module="tensorboard") for plugin in plugins)...)

-## this fails but would be better
-#cur_module = "tensorboard"
-#base_module = cur_module
-#
-## Include search paths
-#includes = [src_dir, src_dir/base_module]
-#
-## Output folder
-#module_out_dir = out_dir/("$cur_module"*"2")
-#
-## Input files
-#infiles = glob("*.proto", src_dir/cur_module/"compat/proto")
-#
-#for plugin in plugins
-#    plugin_proto_files = glob("*.proto", src_dir/cur_module/"plugins/$plugin")
-#    append!(infiles, plugin_proto_files)
-#end
-#
-#mkpath(module_out_dir)
-#includes_str=["--proto_path=$path" for path=includes]
-#run(ProtoBuf.protoc(`$includes_str --julia_out=$module_out_dir $infiles`))
-
+# files_to_include contains all the proto files, can be used for printing and inspection
+println("generated code for \n$files_to_include")

 # Finally move the output directory to the src folder
 mv(out_dir, TBL_root/"src"/"protojl")
