# SP4E - Homework 1

## General info
This file provides brief documentation and information related to the first homework of the course Scientific Programming for Engineers, fall 2019.
This homework was done by O. Ashtari and A. Sieber.
Last update: 16.10.2019
## Project Description
This project solves a minimization problem on n-dimensional quadratic functions defined as S(X) = XᵀAX - Xᵀb, where Xᵀ = [x1, x2, ..., xn], T denotes the transpose operation, A is an n-by-n square matrix, and b is a column vector of size n.
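As an illustration, here is a minimal numpy sketch of how such a function can be evaluated (the values of A, b, and X below are arbitrary examples, not taken from the homework):

```python
import numpy as np

# Arbitrary example data for a 2D case.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
X = np.array([2.0, 1.0])

# S(X) = XᵀAX - Xᵀb
S = X @ A @ X - X @ b
print(S)
```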
## Requirements
The project is written in Python 3.7. It additionally requires the following libraries to work properly:
- numpy
- argparse
- scipy
- matplotlib
- sys
- math
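Of these, numpy, scipy, and matplotlib are third-party packages, while argparse, sys, and math ship with the Python standard library. Assuming pip is available, the third-party dependencies can be installed with:

$ pip3 install numpy scipy matplotlib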
## How to Use
### Program 1: *optimizer.py*
This script solves minimization problems using the scipy.optimize.minimize routine. The minimization solver can be specified as an input argument on the command line:
$ python3 optimizer.py method
where the variable *method* is one of the following:
- Nelder-Mead
- Powell
- CG (default)
- BFGS
- L-BFGS-B
- TNC
- SLSQP
The initial guess of the minimization process has to be specified directly in the optimizer.py file by changing the value of the variable X0. The matrix A and vector B can also be modified directly in the file.
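For reference, the sketch below shows what this setup might look like inside *optimizer.py*. The variable names X0, A, and B match those mentioned above, but the example values and the exact structure of the script are illustrative assumptions, not the actual contents of the file:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative sketch only -- the actual contents of optimizer.py may differ.
A = np.array([[8.0, 1.0],
              [1.0, 3.0]])          # matrix defining the quadratic form
B = np.array([2.0, 4.0])            # vector b of the problem
X0 = np.array([4.0, 4.0])           # initial guess, edited directly in the file

def S(X):
    """Quadratic objective S(X) = XᵀAX - XᵀB."""
    return X @ A @ X - X @ B

# 'method' would normally come from the command-line argument parsed with argparse.
result = minimize(S, X0, method="CG")
print(result.x, result.fun)
```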
### *conjugate_gradient.py*
See Program 2 below.
### *post_processing.py*
Post-processing script that takes as inputs the values of A and B, the intermediate solutions of the iterative minimization process, and the method used. It generates a 3D plot displaying the function to be minimized together with the intermediate solutions.
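The sketch below illustrates the kind of figure such a post-processing step can produce, using the S(X) = XᵀAX - Xᵀb convention from the Project Description. The intermediate solutions plotted here are made-up example values, and the actual interface of *post_processing.py* may differ:

```python
import numpy as np
import matplotlib.pyplot as plt

# Example problem and made-up intermediate solutions (for illustration only).
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
iterates = np.array([[2.0, 1.0], [0.2, 0.5], [0.09, 0.64]])

# Evaluate S(x1, x2) = XᵀAX - Xᵀb on a grid.
x1, x2 = np.meshgrid(np.linspace(-2, 3, 100), np.linspace(-2, 3, 100))
S = (A[0, 0] * x1**2 + (A[0, 1] + A[1, 0]) * x1 * x2 + A[1, 1] * x2**2) \
    - (b[0] * x1 + b[1] * x2)
S_it = np.array([x @ A @ x - x @ b for x in iterates])

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(x1, x2, S, alpha=0.5, cmap="viridis")   # objective surface
ax.plot(iterates[:, 0], iterates[:, 1], S_it, "ro-")    # intermediate solutions
ax.set_xlabel("x1")
ax.set_ylabel("x2")
ax.set_zlabel("S(x)")
plt.show()
```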
### Program 2: *conjugate_gradient.py*
Based on the problem description in the homework sheet, the objective function here is:
S(X)= ½(XᵀAX) - Xᵀb
Therefore, the user should note the factor ½ here: the definition of the matrix A in this program differs from that in Program 1.
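With this convention the gradient is ∇S(x) = Ax - b, so minimizing S is equivalent to solving the linear system Ax = b, which is what the conjugate-gradient iteration does. Below is a minimal sketch of the standard algorithm; it is not necessarily the exact implementation used in *conjugate_gradient.py*:

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-6, max_iter=20):
    """Minimize S(x) = 0.5 xᵀAx - xᵀb, i.e. solve Ax = b, for symmetric positive-definite A."""
    x = x0.astype(float)
    r = b - A @ x          # residual, equal to -∇S(x)
    p = r.copy()           # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)      # step length along p
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p            # new conjugate direction
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b, np.array([2.0, 1.0])))   # approx [0.0909, 0.6364]
```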
To run the program, the objective function must be defined by providing the matrix A and the vector b; a starting point x₀ must be given as well. These three are mandatory arguments, passed after the flags -A, -b, and -x0, and entered in row-major format with entries separated by spaces. For example, the command below runs the program for A=[[4, 1], [1, 3]], bᵀ=[1, 2], x₀ᵀ=[2, 1]:
    $ python3 conjugate_gradient.py -A 4 1 1 3 -b 1 2 -x0 2 1
    ======================Summary======================
    Objective function: S = 0.5 * transpose(x) * A * x - transpose(x) * b
    A =
    [[4. 1.]
     [1. 3.]]
    b =
    [1. 2.]
    ==================Final Result=====================
    Number of steps: 2
    Residual: 5.551115123125783e-17
    Optimum x: [0.09090909 0.63636364]
    Optimum objective: -0.6818181818181817
    ===================================================
When the program runs, the objective function is first shown under Summary to emphasize the notation used. A and b are then printed so the user can check the inputs that define the problem. Under Final Result, the number of iterations, the final error (the L2-norm of the gradient vector), the optimum xᵀ, and the optimum value of the objective function are displayed.
There are also some optional arguments for controlling the program, visualization, and comparison:
- -Nmax, followed by an integer, determines the maximum number of iterations. By default, Nmax=20.
- -criteria, followed by a float, determines the convergence criterion used to stop the iterative optimization. By default, criteria=1e-6.
- -detail is a flag which tells the program to show the result after each step.
- -plot is a flag which tells the program to graphically show the evolution of the search toward the optimum point on the contour of the objective function. This flag only works for 2D problems, i.e. S=S(x1, x2). With this option, the program saves the 3D view as well as its two projections as .pdf files.
- -scipy is a flag which asks the program to solve the same problem using the scipy built-in function for conjugate-gradient minimization and to show the results for comparison.
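For example, the command below combines several of these options for the same problem as above (the values given to -Nmax and -criteria are only illustrative):

$ python3 conjugate_gradient.py -A 4 1 1 3 -b 1 2 -x0 2 1 -Nmax 50 -criteria 1e-8 -detail -plot -scipy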