# SP4E - Homework 1

## General info
This file provides brief documentation for the first homework of the course Scientific Programming for Engineers, fall 2019.
This homework was done by O. Ashtari and A. Sieber.
Last update: 16.10.2019
## Project Description
This project solves a minimization problem for n-dimensional quadratic functions defined as S(X) = XᵀAX - Xᵀb, where Xᵀ = [x1, x2, ..., xn], ᵀ denotes the transpose operation, A is an n-by-n square matrix, and b is a column vector of size n.
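For concreteness, S(X) can be evaluated with numpy in a few lines; the values of A, b, and X below are illustrative, not taken from the repository:

```python
import numpy as np

# Illustrative 2D instance (not from the repository).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

def S(x):
    # Program 1 convention: S(X) = X^T A X - X^T b (no 1/2 factor)
    return x @ A @ x - x @ b

x = np.array([2.0, 1.0])
print(S(x))  # value of the quadratic at x
```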
## Requirements
The project is written in Python 3.7 and requires the following third-party libraries to work properly:
- numpy
- scipy
- matplotlib

It also uses the standard-library modules argparse, sys, and math.
## How to Use
### Program 1: optimizer.py
This script solves minimization problems using the scipy.optimize.minimize routine. The minimization method can be specified as an input argument on the command line:
$ python3 optimizer.py method
where the variable method is one of the following:
- Nelder-Mead
- Powell
- CG (default)
- BFGS
- L-BFGS-B
- TNC
- SLSQP
The initial guess of the minimization process has to be specified directly in the optimizer.py file by changing the value of the variable X0. The matrix A and the vector b can also be modified directly in the file.
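The core of such a script can be sketched as follows; the structure and variable names (X0, A, b) follow the description above, but the actual optimizer.py may differ in detail:

```python
import argparse
import numpy as np
from scipy.optimize import minimize

# Edit these directly in the file, as described above (illustrative values).
A = np.array([[4.0, 0.0],
              [0.0, 2.0]])
b = np.array([0.0, 1.0])
X0 = np.array([3.0, 3.0])  # initial guess

def S(x):
    # Program 1 convention: S(X) = X^T A X - X^T b
    return x @ A @ x - x @ b

def run(method="CG"):
    # Delegate the minimization to scipy with the chosen method.
    return minimize(S, X0, method=method)

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument("method", nargs="?", default="CG",
                        help="one of Nelder-Mead, Powell, CG, BFGS, L-BFGS-B, TNC, SLSQP")
    args = parser.parse_args()
    res = run(args.method)
    print(res.x, res.fun)
```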
### post_processing.py
Post-processing file that takes as inputs the values of A and b, the intermediate solutions of the iterative minimization process, and the method used. It generates a 3D plot displaying the function to be minimized together with the intermediate solutions.
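A minimal sketch of this kind of post-processing (illustrative only; it assumes matplotlib's 3D toolkit and hard-codes a few hypothetical intermediate solutions):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so the script runs headless
import matplotlib.pyplot as plt

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
# Hypothetical intermediate solutions of an iterative minimizer.
iterates = np.array([[2.0, 1.0], [0.236, 0.338], [0.091, 0.636]])

# Sample S(X) = X^T A X - X^T b on a grid for the surface plot.
x1, x2 = np.meshgrid(np.linspace(-2, 2, 50), np.linspace(-2, 2, 50))
Z = (A[0, 0]*x1**2 + (A[0, 1] + A[1, 0])*x1*x2 + A[1, 1]*x2**2) \
    - (b[0]*x1 + b[1]*x2)

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(x1, x2, Z, alpha=0.5)
# Evaluate S at each iterate (row-wise x^T A x minus x^T b) and draw the path.
Sit = np.einsum("ij,jk,ik->i", iterates, A, iterates) - iterates @ b
ax.plot(iterates[:, 0], iterates[:, 1], Sit, "ro-")
fig.savefig("minimization_path.pdf")
```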
### Program 2: conjugate_gradient.py
Following the problem description in the homework sheet, the objective function here is:

S(X) = ½(XᵀAX) - Xᵀb

Note the factor ½: because of it, the definition of the matrix A here differs from that in Program 1.
To run the program, the objective function is defined by supplying the matrix A and the vector b. A starting point x₀ must be provided as well. These three mandatory arguments follow the flags -A, -b, and -x0, respectively, and are entered in row-major format separated by spaces. For example, the command below runs the program for A=[[4, 1], [1, 3]], bᵀ=[1, 2], x₀ᵀ=[2, 1]:
$ python3 conjugate_gradient.py -A 4 1 1 3 -b 1 2 -x0 2 1
There are also optional arguments and flags for controlling the program, the visualization, and the comparison:
- -Nmax, followed by an integer, sets the maximum number of iterations. By default, Nmax=20.
- -criteria, followed by a float, sets the convergence criterion that stops the iterative optimization. By default, criteria=1e-6.
- -detail is a flag that tells the program to print the result after each step.
- -plot is a flag that tells the program to graphically show the evolution of the sought-after optimum point on the contour of the objective function. This flag only works for 2D problems, i.e. S=S(x1, x2). With this option, the program saves the 3D view as well as its two projections as .pdf files.
- -scipy is a flag that asks the program to solve the same problem using the scipy built-in function for conjugate gradient minimization and print the results for comparison.
- -h is a flag that displays the help with the list of arguments and flags.
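The iteration behind the program can be sketched as a textbook conjugate gradient solve for the ½(XᵀAX) - Xᵀb objective (assuming A is symmetric positive definite; the actual code adds the options above around this core):

```python
import numpy as np

def conjugate_gradient(A, b, x0, Nmax=20, criteria=1e-6):
    """Minimize S(x) = 0.5 x^T A x - x^T b, i.e. solve A x = b,
    with the standard conjugate gradient iteration."""
    x = x0.astype(float)
    r = b - A @ x          # residual = -gradient of S at x
    d = r.copy()           # initial search direction
    for _ in range(Nmax):
        if np.linalg.norm(r) < criteria:
            break
        alpha = (r @ r) / (d @ A @ d)      # exact line search step
        x = x + alpha * d
        r_new = r - alpha * (A @ d)
        beta = (r_new @ r_new) / (r @ r)   # conjugacy update
        d = r_new + beta * d
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b, np.array([2.0, 1.0]))
print(x)  # converges to A^{-1} b = [1/11, 7/11]
```

For a 2D symmetric positive definite A, this iteration reaches the exact minimizer in at most two steps, which matches the sample output shown below.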
For example, the previous command line, extended with the optional flags and arguments, leads to:
```
$ python conjugate_gradient.py -A 4 1 1 3 -b 1 2 -x0 2 1 -Nmax 5 -criteria 1e-12 -detail -plot -scipy
======================Summary======================
Objective function: S = 0.5 * transpose(x) * A * x - transpose(x) * b
A =
[[4. 1.]
 [1. 3.]]
b =
[1. 2.]
---------------------------------------------------
Step: 0
Residual: 8.54400374531753
Solution(x): [2. 1.]
Objective: 7.5
---------------------------------------------------
Step: 1
Residual: 0.8001937042442401
Solution(x): [0.23564955 0.33836858]
Objective: -0.5498489425981874
==================Final Result=====================
Number of steps: 2
Residual: 5.551115123125783e-17
Optimum x: [0.09090909 0.63636364]
Optimum objective: -0.6818181818181817
===================================================
==================Scipy Results=====================
     fun: -0.6818181818181795
     jac: array([-9.68575478e-08, -4.47034836e-08])
 message: 'Optimization terminated successfully.'
    nfev: 20
     nit: 2
    njev: 5
  status: 0
 success: True
       x: array([0.09090906, 0.63636363])
====================================================
```
Running the program displays four separate blocks:
- First, under Summary, the objective function is shown to emphasize the notation used; then A and b are printed so the user can check the inputs that define the problem.
- If the flag -detail is given, the status of the sought-after point is shown at every step, including the iteration number, the residual (L2-norm of the gradient vector), the position of the point, and the corresponding S(X).
- Under Final Result, the same data (number of iterations, final residual, optimum xᵀ, and optimum value of the objective function) are displayed.
- If the flag -scipy is given, the result of solving the same problem with the conjugate gradient (CG) algorithm in scipy.optimize.minimize is also shown, letting the user compare the results of the written code with what the built-in solver gives.
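Such a comparison can also be reproduced standalone (illustrative values; the real script triggers this via the -scipy flag), checking the scipy CG result against the exact minimizer A⁻¹b:

```python
import numpy as np
from scipy.optimize import minimize

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
x0 = np.array([2.0, 1.0])

def S(x):
    # Program 2 convention: S(X) = 0.5 X^T A X - X^T b
    return 0.5 * x @ A @ x - x @ b

res = minimize(S, x0, method="CG")  # scipy's conjugate gradient
x_exact = np.linalg.solve(A, b)     # minimizer of S when A is SPD
print(res.x, x_exact)
```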