After several days of work, it is finally installed. I hit every pitfall along the way.
Perhaps because I am not proficient with Linux, I did not know how to solve the problems that came up during configuration, so I want to keep a comprehensive record.
I. Knowledge Points - Adding Environment Variables
0 Viewing environment variables
You can list all exported variables with `export`. Filter for a keyword with `export | grep anaconda`, where the word after `grep` is the keyword you want to match.
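A quick sketch of the filtering (`MY_TOOL_HOME` is a hypothetical variable used only for illustration; substitute any keyword after `grep`):

```shell
# Export a hypothetical variable, then filter the export list for it.
export MY_TOOL_HOME=/opt/mytool
export | grep MY_TOOL
```

The matching line is printed in whichever format your shell uses (bash prints `declare -x MY_TOOL_HOME="/opt/mytool"`).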
1 Temporary addition
Append at the end of the existing value:

```shell
export PATH=$PATH:/home/nnir712/software/protobuf/bin
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/nnir712/software/protobuf/lib
```
Assigning directly (without `$PATH:` in front) overwrites the variable's previous value.
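The difference can be sketched like this (paths are the illustrative ones from above; PATH is restored at the end):

```shell
# Save PATH so we can restore it afterwards.
OLD_PATH=$PATH

# Appending keeps the existing entries:
export PATH=$PATH:/home/nnir712/software/protobuf/bin
case $PATH in
  *:/home/nnir712/software/protobuf/bin) echo "old entries kept" ;;
esac

# Assigning directly throws the existing entries away:
export PATH=/home/nnir712/software/protobuf/bin
echo "$PATH"   # now contains only the new directory

PATH=$OLD_PATH   # restore
```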
2 Permanent additions
`vi ~/.bashrc` and add the environment variables you want to the file, in the same form as above.
This change takes effect after restarting the terminal window (or running `source ~/.bashrc`).
`vi /etc/profile` and likewise add the environment variables you want to the file.
This change takes effect after rebooting the computer.
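The mechanics can be demonstrated side-effect free with a temporary file standing in for `~/.bashrc` (the variable name and path are illustrative):

```shell
# Stand-in for ~/.bashrc so the demo does not touch real config.
RC=$(mktemp)

# Append the export line, exactly as you would to ~/.bashrc.
echo 'export PROTOBUF_HOME=/home/nnir712/software/protobuf' >> "$RC"

# Restarting the terminal re-reads the file; sourcing it does the same right now.
. "$RC"
echo "$PROTOBUF_HOME"

rm -f "$RC"
```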
2. Caffe Installation in an Anaconda Environment
Three relatively reliable references:
[Ubuntu 16.04 Caffe Installation Step Record (very detailed)]
[Caffe Learning (II): Downloading, Compiling, and Installing Caffe from Source]
In the end I followed [caffe installation]() and succeeded.
**The installation steps do not have to follow this exact order.**
Different conda versions also behave differently: the procedure that worked the first time failed when I tried it again.
1. Installing Dependencies
### First install numpy and skimage
Unlike those guides, I installed numpy first, because I found that caffe needs skimage, and certain numpy and skimage versions are incompatible ([numpy/issues](https://github.com/numpy/numpy/issues/12744)). I eventually found that numpy 1.15.4 works with skimage.
```shell
conda install numpy==1.15.4
conda install -c conda-forge scikit-image
```

Then start python and try:

```python
from skimage import io, transform
```

As long as no error is reported, it succeeded.
It is not actually necessary to install numpy first: on my second attempt, skimage failed even with numpy==1.15.4 already installed. It seems conda has been updated since. Running `conda install -c conda-forge scikit-image` directly is enough; conda automatically installed numpy 1.11.3 for me.
```shell
conda install boost hdf5 snappy leveldb lmdb gflags glog
```
This protobuf is messy; it may already be present in your environment.
First check with `protoc --version`: if it prints a version number, protobuf is installed. Note, though, that you may still need to uninstall and reinstall it.
It can be installed in three ways. I still recommend the conda installation, because that is the way I installed it and it works.

(1) apt installation

With apt, protoc is installed in `/usr/bin` and can be checked with `ls /usr/bin | grep protoc`. `/usr/bin` is usually already on PATH, so a protobuf installed with apt will be found by every environment, including anaconda's sandboxed environments. If you then install it again with anaconda, protobuf ends up installed in two places, which is very messy.
If it is messed up or not found, uninstall first and install again:

```shell
sudo apt-get remove libprotobuf-dev
sudo apt-get remove protobuf-compiler
sudo apt-get install libprotobuf-dev protobuf-compiler
```
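To see how the "two places" mess arises, this sketch scans every PATH directory for a `protoc` binary. Two fake stubs in temp directories stand in for apt's `/usr/bin` copy and anaconda's copy:

```shell
# Create two fake protoc stubs to simulate duplicate installs.
a=$(mktemp -d); b=$(mktemp -d)
touch "$a/protoc" "$b/protoc"
chmod +x "$a/protoc" "$b/protoc"
PATH="$a:$b:$PATH"

# Scan every PATH entry for an executable named protoc.
found=0
IFS=:
for dir in $PATH; do
  if [ -x "$dir/protoc" ]; then
    found=$((found + 1))
    echo "$dir/protoc"
  fi
done
unset IFS
echo "copies on PATH: $found"

rm -rf "$a" "$b"
```

Only the first copy listed is the one `protoc` actually resolves to; any later copies are the ones that cause version confusion.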
(2) conda installation
Here is another reference: [caffe installation]()
```shell
conda install protobuf==3.5.1

# Add environment variables:
echo 'export PATH=/home/nnir712/anaconda3/pkgs/libprotobuf-3.5.2-h6f1eeef_0/bin:$PATH' >> ~/.bashrc
echo 'export LD_LIBRARY_PATH=/home/nnir712/anaconda3/pkgs/libprotobuf-3.5.2-h6f1eeef_0/lib:$LD_LIBRARY_PATH' >> ~/.bashrc

# Reload:
source ~/.bashrc

# Recompile and install protobuf-python
# (download from https://github.com/protocolbuffers/protobuf/releases/tag/v3.5.1)
tar -xzvf protobuf-python-3.5.1.tar.gz
cd protobuf-3.5.1/python
python setup.py build
python setup.py test
python setup.py install
```
(3) Compiling and installing from source
Reference: [source code compilation and installation](https://blog.csdn.net/wenwenxiong/article/details/53644845)
3. Modify Makefile.config
Refer entirely to [caffe installation](https://zoesxw.github.io/2018/07/14/caffe%E5%AE%89%E8%A3%85/)
Mainly pay attention to the python path settings: ANACONDA_HOME, PYTHON_INCLUDE, and PYTHON_LIB.
If other settings are incorrect, you will be prompted at compile time.
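One way to avoid guessing those paths is to ask the interpreter itself. A sketch using the system `python3` (for an anaconda build, run the conda env's own python instead):

```shell
# The header directory PYTHON_INCLUDE must contain (Python.h lives here).
inc=$(python3 -c "import sysconfig; print(sysconfig.get_paths()['include'])")
echo "PYTHON_INCLUDE candidate: $inc"

# The directory PYTHON_LIB should point at (libpythonX.Ym.so lives here).
lib=$(python3 -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")
echo "PYTHON_LIB candidate: $lib"
```

If numpy is installed, `python3 -c "import numpy; print(numpy.get_include())"` gives the third PYTHON_INCLUDE entry.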
```makefile
## Refer to http://caffe.berkeleyvision.org/installation.html
# Contributions simplifying and improving our build system are welcome!

# cuDNN acceleration switch (uncomment to build with cuDNN).
USE_CUDNN := 1

# CPU-only switch (uncomment to build without GPU support).
# CPU_ONLY := 1

# uncomment to disable IO dependencies and corresponding data layers
USE_OPENCV := 0
# USE_LEVELDB := 0
# USE_LMDB := 0
# This code is taken from https://github.com/sh1r0/caffe-android-lib
# USE_HDF5 := 0

# uncomment to allow MDB_NOLOCK when reading LMDB files (only if necessary)
# You should not set this flag if you will be reading LMDBs with any
# possibility of simultaneous read and write
# ALLOW_LMDB_NOLOCK := 1

# Uncomment if you're using OpenCV 3
OPENCV_VERSION := 3

# To customize your choice of compiler, uncomment and set the following.
# N.B. the default for Linux is g++ and the default for OSX is clang++
# CUSTOM_CXX := g++

# CUDA directory contains bin/ and lib/ directories that we need.
CUDA_DIR := /usr/local/cuda-10.0
# On Ubuntu 14.04, if cuda tools are installed via
# "sudo apt-get install nvidia-cuda-toolkit" then use this instead:
# CUDA_DIR := /usr

# CUDA architecture setting: going with all of them.
# For CUDA < 6.0, comment the *_50 through *_61 lines for compatibility.
# For CUDA < 8.0, comment the *_60 and *_61 lines for compatibility.
# For CUDA >= 9.0, comment the *_20 and *_21 lines for compatibility.
CUDA_ARCH := -gencode arch=compute_30,code=sm_30 \
		-gencode arch=compute_35,code=sm_35 \
		-gencode arch=compute_50,code=sm_50 \
		-gencode arch=compute_52,code=sm_52 \
		-gencode arch=compute_60,code=sm_60 \
		-gencode arch=compute_61,code=sm_61 \
		-gencode arch=compute_61,code=compute_61

# BLAS choice:
# atlas for ATLAS (default)
# mkl for MKL
# open for OpenBlas
BLAS := atlas
# Custom (MKL/ATLAS/OpenBLAS) include and lib directories.
# Leave commented to accept the defaults for your choice of BLAS
# (which should work)!
# BLAS_INCLUDE := /path/to/your/blas
# BLAS_LIB := /path/to/your/blas

# Homebrew puts openblas in a directory that is not on the standard search path
# BLAS_INCLUDE := $(shell brew --prefix openblas)/include
# BLAS_LIB := $(shell brew --prefix openblas)/lib

# This is required only if you will compile the matlab interface.
# MATLAB directory should contain the mex binary in /bin.
# MATLAB_DIR := /usr/local
# MATLAB_DIR := /Applications/MATLAB_R2012b.app

# NOTE: this is required only if you will compile the python interface.
# We need to be able to find Python.h and numpy/arrayobject.h.
# PYTHON_INCLUDE := /usr/include/python2.7 \
#		/usr/lib/python2.7/dist-packages/numpy/core/include

# Anaconda Python distribution is quite popular. Include path:
# Verify anaconda location, sometimes it's in root.
ANACONDA_HOME := $(HOME)/anaconda3
# PYTHON_INCLUDE := $(ANACONDA_HOME)/include \
#		$(ANACONDA_HOME)/include/python3.6m \
#		$(ANACONDA_HOME)/lib/python3.6/site-packages/numpy/core/include
PYTHON_INCLUDE := $(ANACONDA_HOME)/envs/skimg/include \
		$(ANACONDA_HOME)/envs/skimg/include/python3.6m \
		$(ANACONDA_HOME)/envs/skimg/lib/python3.6/site-packages/numpy/core/include

# Uncomment to use Python 3 (default is Python 2)
PYTHON_LIBRARIES := boost_python36 python3.6m
# PYTHON_INCLUDE := /usr/include/python3.5m \
#		/usr/lib/python3.5/dist-packages/numpy/core/include

# We need to be able to find libpythonX.X.so or .dylib.
PYTHON_LIB := /usr/lib
PYTHON_LIB := $(ANACONDA_HOME)/envs/skimg/lib

# Homebrew installs numpy in a non standard path (keg only)
# PYTHON_INCLUDE += $(dir $(shell python -c 'import numpy.core; print(numpy.core.__file__)'))/include
# PYTHON_LIB += $(shell brew --prefix numpy)/lib

# Uncomment to support layers written in Python (will link against Python libs)
WITH_PYTHON_LAYER := 1

# Whatever else you find you need goes here.
INCLUDE_DIRS := $(PYTHON_INCLUDE) /usr/local/include /usr/include/hdf5/serial
LIBRARY_DIRS := $(PYTHON_LIB) /usr/local/lib /usr/lib /usr/lib/x86_64-linux-gnu /usr/lib/x86_64-linux-gnu/hdf5/serial /home/nnir712/anaconda3/envs/skimg/lib/
#/home/nnir712/software/protobufin/include/

# If Homebrew is installed at a non standard location (for example your home directory) and you use it for general dependencies
# INCLUDE_DIRS += $(shell brew --prefix)/include
# LIBRARY_DIRS += $(shell brew --prefix)/lib

# NCCL acceleration switch (uncomment to build with NCCL)
# https://github.com/NVIDIA/nccl (last tested version: v1.2.3-1+cuda8.0)
# USE_NCCL := 1

# Uncomment to use `pkg-config` to specify OpenCV library paths.
# (Usually not necessary -- OpenCV libraries are normally installed in one of the above $LIBRARY_DIRS.)
# USE_PKG_CONFIG := 1

# N.B. both build and distribute dirs are cleared on `make clean`
BUILD_DIR := build
DISTRIBUTE_DIR := distribute

# Uncomment for debugging. Does not work on OSX due to https://github.com/BVLC/caffe/issues/171
# DEBUG := 1

# The ID of the GPU that 'make runtest' will use to run unit tests.
TEST_GPUID := 0

# enable pretty build (comment to see full commands)
Q ?= @
```
```shell
make clean        # if you compiled before, clean up first
make all -j8      # -j8 uses eight threads; match the number of cores on your machine
make test -j8
make runtest -j8
make pycaffe -j8
```

Add environment variables:

```shell
export PYTHONPATH=$PYTHONPATH:/home/nnir712/software/caffe/python/
export CAFFE_ROOT="/home/nnir712/software/caffe/"
```

Success! You can test whether `import caffe` works.

Incidentally, install opencv:

```shell
pip install opencv-python==22.214.171.124 -i https://pypi.douban.com/simple/
pip install opencv-contrib-python==126.96.36.199 -i https://pypi.douban.com/simple/
```
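Appending to PYTHONPATH blindly in `~/.bashrc` adds a duplicate entry every time the file is re-sourced; a guard like the following (path as in the text) keeps the append idempotent:

```shell
CAFFE_PY=/home/nnir712/software/caffe/python

# Append only if the directory is not already on PYTHONPATH.
case ":$PYTHONPATH:" in
  *":$CAFFE_PY:"*) ;;   # already present, do nothing
  *) PYTHONPATH="${PYTHONPATH:+$PYTHONPATH:}$CAFFE_PY" ;;
esac
export PYTHONPATH
echo "$PYTHONPATH"
```

Running the snippet twice leaves PYTHONPATH unchanged the second time.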
## Question 1: protobuf problem

`fatal error: google/protobuf/port_def.inc: No such file or directory`

```shell
sudo apt-get install libprotobuf-dev libleveldb-dev libsnappy-dev libopencv-dev libhdf5-serial-dev protobuf-compiler
sudo apt-get install --no-install-recommends libboost-all-dev
sudo apt-get install libopenblas-dev liblapack-dev libatlas-base-dev
sudo apt-get install libgflags-dev libgoogle-glog-dev liblmdb-dev
sudo apt-get install libprotobuf-dev libleveldb-dev libsnappy-dev libopencv-dev libboost-all-dev libhdf5-serial-dev libgflags-dev libgoogle-glog-dev liblmdb-dev protobuf-compiler libatlas-base-dev
sudo apt-get install python-dev python-pip gfortran
```

These commands may report that the latest version is already installed, in which case nothing is reinstalled. Use `protoc --version` to see the version number and confirm that protobuf can be found. If not, uninstall first and install again:

```shell
sudo apt-get remove libprotobuf-dev
sudo apt-get remove protobuf-compiler
# Installation:
sudo apt-get install libprotobuf-dev protobuf-compiler
```

## Question 2: python path

`/usr/include/boost/python/detail/wrap_python.hpp:50:11: fatal error: pyconfig.h: No such file or directory`

Point PYTHON_INCLUDE at the conda environment:

```makefile
PYTHON_INCLUDE := $(ANACONDA_HOME)/envs/caffepy36/include \
		$(ANACONDA_HOME)/envs/caffepy36/include/python3.6m \
		$(ANACONDA_HOME)/envs/caffepy36/lib/python3.6/site-packages/numpy/core/include
```

If numpy was not found: `sudo apt-get remove libboost-all-dev`

`/usr/bin/ld: cannot find -lboost_python36`

```shell
export PYTHONPATH=$PYTHONPATH:/home/nnir712/software/caffe/python/
export CAFFE_ROOT="/home/nnir712/software/caffe/"
```

`ImportError: libboost_system.so.1.67.0: cannot open shared object file: No such file or directory`

[libboost_system.so.1.67.0 is configured but still reported as not found](https://blog.csdn.net/qq_33144323/article/details/81951308)

```shell
sudo vi /etc/ld.so.conf
# Add: /home/nnir712/anaconda3/envs/caffepy36/lib/
sudo ldconfig
```
To be continued