Installing TensorFlow with GPU support for Windows 10 (and probably other versions).
# Goal
At the BIOP we wish to host [[ https://csbdeep.bioimagecomputing.com/ | CARE]], [[ https://github.com/juglab/n2v| Noise2Void ]] and [[ URL | Stardist]] on our workstations (other deep learning models and tools based on Keras and TensorFlow should also work).
# No to Conda
Most deep learning setups rely on Conda or Miniconda, which is nice and powerful, but more often than not hard for us to manage because:
1. Conda manages certain packages, and not others (especially obscure scientific packages), so we need to keep up to date what we install with `conda` and what we install with `pip`.
2. We are managing sessions on Windows machines where many different users log in and might need the same environments (or not). Managing these with `conda` is tedious.
3. Conda is //heavy// for what we need.
NOTE: One nice thing about `conda` is that it handles the installation of the prerequisites, which in the following protocol we need to do by hand (only once per physical machine). But this does not outweigh the advantages of our approach.
# Yes to Python `virtualenv`
Python virtualenv turned out to be a great fit for our needs and, thanks to CSBDeep's clean infrastructure, a breeze to manage.
1. A `virtualenv` is merely one folder somewhere. Nothing is managed centrally so if we need to delete it, we simply delete the folder.
2. We need only use `pip` for management.
3. It is very lightweight (at least until you dump all the dependencies we need).
It is very easy to create environments for all the tools we wish to use, and we can create folders for each user and tailor them as needed. After that, `pip freeze > requirements.txt` is enough to give users a copy they can reproduce on their own workstation.
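The `pip freeze` hand-off can also be scripted from Python itself; a minimal sketch (the helper name and output path below are our own, not part of any tool):

```lang=python
import subprocess
import sys

def export_requirements(path="requirements.txt"):
    """Run `pip freeze` for the current interpreter and write the
    pinned package list to a requirements file."""
    frozen = subprocess.check_output(
        [sys.executable, "-m", "pip", "freeze"], text=True
    )
    with open(path, "w") as f:
        f.write(frozen)
    return frozen

if __name__ == "__main__":
    print(export_requirements())
```

A user can then recreate the environment with `pip install -r requirements.txt` inside their own virtualenv.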
IMPORTANT: We can help set up user workstations, but we will not debug issues with CUDA or cuDNN. This works for us on Win10 x64 with RTX 2080 Ti cards, and we have not done extensive tests. We are not IT support either.
(WARNING) Protocol was made by Olivier Burri, BIOP. Last update: November 2019
= Prerequisites =
# Get latest [[https://www.geforce.com/drivers|NVIDIA drivers]]
# Get [[https://developer.nvidia.com/cuda-10.0-download-archive|CUDA 10.0 Toolkit]]
# Get [[https://developer.nvidia.com/rdp/cudnn-download|cuDNN for CUDA 10.0]] (requires an NVIDIA developer login)
# Get [[https://www.python.org/downloads/|Python 3.7 x64]]
# Get [[https://nodejs.org/en/|NodeJS ]] for managing Jupyter Lab
Install everything with the defaults except Python: make sure to add Python to the PATH when the installer offers it.
Everything installs automatically **except for cuDNN**. For cuDNN, copy the contents of the zip file into
`C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.0`
WARNING: You currently cannot go higher than CUDA 10.0, so make sure you downloaded that version!
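Since the cuDNN copy is the only manual step, it is also the easiest one to get wrong. The sketch below is our own helper (the file names assume a cuDNN 7.x zip) and checks that the expected files ended up under the CUDA 10.0 folder:

```lang=python
import os

# Default CUDA 10.0 install location on Windows (adjust if you changed it).
CUDA_DIR = r"C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v10.0"

# Files the cuDNN zip should have contributed; names assume cuDNN 7.x.
CUDNN_FILES = [
    os.path.join("bin", "cudnn64_7.dll"),
    os.path.join("include", "cudnn.h"),
    os.path.join("lib", "x64", "cudnn.lib"),
]

def missing_cudnn_files(cuda_dir=CUDA_DIR):
    """Return the list of expected cuDNN files not found under cuda_dir."""
    return [f for f in CUDNN_FILES
            if not os.path.isfile(os.path.join(cuda_dir, f))]

if __name__ == "__main__":
    missing = missing_cudnn_files()
    print("cuDNN OK" if not missing else "Missing: %s" % missing)
```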
== Stardist Prerequisites ==
IMPORTANT: Stardist has an additional prerequisite!
For Stardist to compile with `pip`, you need to install the [[ https://visualstudio.microsoft.com/downloads/#build-tools-for-visual-studio-2019 | Build Tools for Visual Studio 2019 ]].
The minimum **individual components** that you need are the following:
# `MSVC v142 - VS 2019 C++ x64/x86 build tools`
# `Windows 10 SDK (10.0.18362.0)`
After this, you may need to restart your PC.
== Installation of `pip` and `virtualenv` ==
After installing Python, update `pip` and install `virtualenv`.
Start `cmd` as an administrator and use:
```
python -m pip install --upgrade pip
pip install virtualenv
```
= Create a CARE virtualenv =
NOTE: For this example, we want to create a `CARE-TF-GPU` folder on the `D:\` drive
== 1. Create the virtual environment ==
```
virtualenv -p python D:\CARE-TF-GPU\
```
== 2. Activate the virtual environment ==
```
d:
cd CARE-TF-GPU
Scripts\activate
```
== 3. Install the required dependencies ==
```
pip install tensorflow-gpu==1.14 nodejs csbdeep scikit-image jupyterlab jupyter_tensorboard ipywidgets widgetsnbextension
```
This will install Jupyter Lab and the most recent CARE package available via `pip`.
NOTE: We include `ipywidgets` because we use them in //our// notebooks. `jupyter_tensorboard` is not installed by `csbdeep`, so you need to install it explicitly as well. We sometimes make use of `scikit-image` when reloading data, which is why it is there. `nodejs` is needed for Jupyter Lab and to install the extensions in step 5.
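A quick way to confirm that the installs resolved is to try importing the key packages from the environment's Python (note that `tensorflow-gpu` is imported as `tensorflow`, and `scikit-image` as `skimage`); the helper below is our own sketch:

```lang=python
import importlib

# Import names for the packages installed in step 3.
PACKAGES = ["tensorflow", "csbdeep", "skimage", "ipywidgets"]

def check_imports(names):
    """Return a dict mapping each module name to whether it imports."""
    results = {}
    for name in names:
        try:
            importlib.import_module(name)
            results[name] = True
        except ImportError:
            results[name] = False
    return results

if __name__ == "__main__":
    for name, ok in check_imports(PACKAGES).items():
        print("%-12s %s" % (name, "OK" if ok else "MISSING"))
```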
== 4. Check that Jupyter is loading the right kernel ==
```
jupyter kernelspec list
```
Your output should look like this:
```
Available kernels:
python3 d:\care-tf-gpu\share\jupyter\kernels\python3
```
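Independently of Jupyter, you can also ask Python itself whether it is running inside the virtualenv: `virtualenv` rewrites `sys.prefix` to point at the environment folder while keeping the original interpreter location in `real_prefix` (or `base_prefix` for the built-in `venv`). A minimal sketch:

```lang=python
import sys

def in_virtualenv():
    """True when the interpreter runs inside a virtualenv/venv.

    The original interpreter location stays available as real_prefix
    (classic virtualenv) or base_prefix (venv); if it differs from
    sys.prefix, we are inside an environment.
    """
    base = getattr(sys, "real_prefix", None) or getattr(sys, "base_prefix", sys.prefix)
    return base != sys.prefix

if __name__ == "__main__":
    print("virtualenv:", in_virtualenv(), "->", sys.prefix)
```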
== 5. Install the JupyterLab extensions ==
```
jupyter labextension install @jupyter-widgets/jupyterlab-manager & jupyter labextension install jupyterlab_tensorboard
```
== 6. Test ==
Launch Jupyter Lab with
```
jupyter lab
```
Make a new Notebook and run a new cell with
```lang=python
import tensorflow as tf
with tf.device('/gpu:0'):
    a = tf.constant([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], shape=[2, 3], name='a')
    b = tf.constant([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], shape=[3, 2], name='b')
    c = tf.matmul(a, b)

with tf.Session() as sess:
    print(sess.run(c))
```
If that works, then TensorFlow is running for you.
= Starting a session =
Navigate to the folder where you created the virtual environment and open a command line there.
```
Scripts\activate
jupyter lab
```
This should open your default browser and launch Jupyter Lab.
= Magic BAT file for Installation =
All the steps we performed here are collected in the BAT script below.
1. Copy the lines below.
2. Create an `install-care.bat` file and copy them inside.
3. Modify `installPath` to where you want to install the virtual environment (**avoid spaces in the name!**).
4. Save and run `install-care.bat`. No admin rights should be needed.
```
@echo off
REM Install path for the virtual environment
set installPath=D:\CARE-TF-GPU
echo %installPath%
REM Make sure pip is installed and up to date
python -m pip install --upgrade pip
REM Install virtualenv if it is not there yet
pip install virtualenv
REM Finally, create the desired virtual environment
virtualenv -p python %installPath%
REM In order: activate the environment, install all dependencies, install the JupyterLab extensions and check the kernel location
cmd /k "cd /D %installPath%\Scripts\ & activate & pip install tensorflow-gpu==1.14 nodejs csbdeep jupyterlab jupyter_tensorboard ipywidgets widgetsnbextension scikit-image & jupyter labextension install @jupyter-widgets/jupyterlab-manager & jupyter labextension install jupyterlab_tensorboard & jupyter kernelspec list"
PAUSE
```
= Magic BAT file for running (add as a desktop shortcut, for instance) =
```
@echo off
cmd /k "cd /d D:\CARE-TF-GPU & Scripts\activate & jupyter lab"
```