diff --git a/Homework1/README.md b/Homework1/README.md
index 7f2a8b6..1bb792c 100644
--- a/Homework1/README.md
+++ b/Homework1/README.md
@@ -1,196 +1,196 @@
 # SP4E - Homework 1

 ---

 ## General Info

 This file provides brief documentation for the first homework of the course "Scientific Programming for Engineers", fall 2019.

 This homework was done by **O. Ashtari** and **A. Sieber**.

 Last update: 16.10.2019

 ---

 ## Project Description

 This project is intended to solve a minimization problem for n-dimensional quadratic functions defined as

 S(X) = XᵀAX - Xᵀb

 where Xᵀ = [x1, x2, ..., xn], ᵀ denotes the transpose operation, A is an n-by-n square matrix, and b is a column vector of size n.

 ---

 ## Requirements

 The project was created using Python 3.7. It additionally requires the following libraries and standard modules to work properly:
 * numpy
 * argparse
 * scipy
 * matplotlib
 * sys
 * math

 ---

 ## Program 1: optimizer.py

 Based on the first problem description in the homework sheet, the objective function here is:

 S(X) = XᵀAX - Xᵀb

-Therefore, the user should note that definition of matrix A here differs from that in Program 2 in the factor 1/2. The goal here is using `scipy.optimize.minimize` routine to optimize the aforementioned function for a specific matrix A=[[4, 0], [1, 3]] and a specific vector bᵀ=[0, 1]:
+Therefore, the user should note that the definition of matrix A here differs from that in Program 2 by the factor 1/2. The goal here is to use the `scipy.optimize.minimize` routine to minimize the aforementioned function for the specific matrix A=[[4, 0], [1, 3]] and the specific vector bᵀ=[0, 1].

 ### How to Use

 In its simplest form, the program runs with the following command, using the default method (conjugate gradient, CG) and the default initial guess (x₀ᵀ=[3, 2]):
 ```
 $ python optimizer.py
 ```
 However, both the method and the initial guess can be set by the user. To that end, `-method` should be followed by the name of the method of choice, and `-x0` should be followed by the first and second elements of the initial guess, separated by a space. An example is shown below:
 ```
 $ python optimizer.py -method BFGS -x0 0 4
 ```
 The method can be any of the following:
 * Nelder-Mead
 * Powell
 * CG (default)
 * BFGS
 * L-BFGS-B
 * TNC
 * SLSQP

-The program, finally, draws evolution of the to-be-detected optimum point on the contour of the objective function. The program saves the 3D view as well as its 2D projection as .pdf files.
+Finally, the program draws the evolution of the minimization process (red dots) on top of the contour plot of the objective function and saves the 3D view as well as its 2D projection as .pdf files.

-**N.B.:** some systems require the user to type in `python3` instead of `python` in the command line.
+**N.B.**: some systems may require the user to type `python3` instead of `python` in the command line.
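+Conceptually, the core of this program can be reproduced in a few lines. The following is only a minimal sketch, not the actual source of `optimizer.py`: it defines S(X) with `numpy` and collects the intermediate solutions through the `callback` hook of `scipy.optimize.minimize`, which is one way to obtain the points drawn in the plots.
+```
+import numpy as np
+from scipy.optimize import minimize
+
+A = np.array([[4, 0], [1, 3]])
+b = np.array([0, 1])
+
+def S(x):
+    # objective in the notation of this program (no 1/2 factor)
+    return x.T @ A @ x - x.T @ b
+
+x0 = np.array([3.0, 2.0])
+steps = [x0]  # record the intermediate solutions for plotting
+result = minimize(S, x0, method='CG', callback=lambda xk: steps.append(xk))
+print(result.x)  # approximately [-0.0212766, 0.1702128]
+```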

 ### Example:

 The previous command line, using the optional flags and arguments, will display the following information on the terminal:
 ```
 $ python optimizer.py -method BFGS -x0 0 4
 ======================Summary======================
 Objective function: S = transpose(x) * A * x - transpose(x) * b
 A =
 [[4 0]
  [1 3]]
 b =
 [0 1]
 method: BFGS
 ---------------------------------------------------
       fun: -0.08510638297872285
  hess_inv: array([[ 0.12765957, -0.0212766 ],
        [-0.0212766 ,  0.17021277]])
       jac: array([-1.02445483e-08, -1.86264515e-08])
   message: 'Optimization terminated successfully.'
      nfev: 24
       nit: 4
      njev: 6
    status: 0
   success: True
         x: array([-0.0212766 ,  0.17021276])
 ```

-Under "Summary", the objective function expression is shown to emphasize again on the notation used. Then `A`, `b`, and the optimization method which have defined the problem are displayed. Finally, details of the iterative optimization process are shown. Those are the default outputs of the `scipy.optimize.minimize` routine.
+Under "Summary", the objective function expression is shown to re-emphasize the notation used. Then `A`, `b`, and the chosen optimization method are displayed. Finally, the details of the iterative optimization process are shown. These are the default outputs of the `scipy.optimize.minimize` routine.

 ---

 ## Program 2: conjugate_gradient.py

 Based on the second problem description in the homework sheet, the objective function here is:

 S(X) = ½(XᵀAX) - Xᵀb

-Therefore, the user should note the factor 1/2 here, which means definition of matrix A here is different from the one of program 1. The goal here is to implement a conjugate gradients algorithm that takes A and b of any (compatible) dimension and finds its extremum (i.e. gradient(S)=0). The problem thus consists in solving the linear system ½(A+Aᵀ)X=b.
+The user should note the factor 1/2 here: the definition of matrix A differs from that of Program 1. The goal of this program is to implement a conjugate gradient algorithm that takes A and b of any (compatible) dimension and finds the extremum of S (i.e. where gradient(S) = 0). The problem thus amounts to solving the linear system ½(A+Aᵀ)X = b.
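+
+The iteration implemented for this purpose follows the textbook conjugate gradient method. Below is a rough, self-contained sketch (parameter names mirror the flags described further down; the actual implementation in `conjugate_gradient.py` may differ in its details):
+```
+import numpy as np
+
+def conjugate_gradient(A, b, x0, Nmax=20, criteria=1e-6):
+    """Solve 1/2 * (A + A^T) x = b with the conjugate gradient method."""
+    M = 0.5 * (A + A.T)      # symmetric part; gradient(S) = M x - b
+    x = x0.astype(float)
+    r = b - M @ x            # residual, i.e. minus the gradient of S
+    p = r.copy()             # first search direction
+    for _ in range(Nmax):
+        if np.linalg.norm(r) < criteria:
+            break
+        alpha = (r @ r) / (p @ (M @ p))   # step length along p
+        x = x + alpha * p
+        r_new = r - alpha * (M @ p)
+        beta = (r_new @ r_new) / (r @ r)  # conjugacy update
+        p = r_new + beta * p
+        r = r_new
+    return x
+```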

 ### How to Use

-To run the program, the objective function must first be defined through the matrix A and the vector b. The code must also be provided with a starting point x₀. These are mandatory arguments. They are entered in the program through the command line following the respective flags `-A`, `-b`, and `-x0`, in a rowmajor format separated by spaces. For instance, the command line below runs the program for:
+To run the program, the objective function must first be defined through the matrix A and the vector b. The code must also be provided with a starting point x₀ (initial guess). These are mandatory arguments. They are entered through the command line following their respective flags `-A`, `-b`, and `-x0`, in row-major format, separated by spaces. For instance, the command line below runs the program for:

 A=[[4, 1], [1, 3]], bᵀ=[1, 2], x₀ᵀ=[2, 1]
 ```
 $ python conjugate_gradient.py -A 4 1 1 3 -b 1 2 -x0 2 1
 ```
-Additionally, optional arguments can be provided by the user for tasks such as controlling the program, visualization, and verification:
+Additionally, optional arguments can be provided by the user to control the program progression, visualization, and verification. These additional functionalities are accessed through the following flags:

-* `-Nmax`, followed by an integer number determines maximum number of iterations. By default `Nmax=20`.
-* `-criteria`, followed by a float number determines the convergence criteria to stop iterations. By default, `criteria=1e-6`.
+* `-Nmax`, followed by an integer, sets the maximum number of iterations. By default, `Nmax=20`.
+* `-criteria`, followed by a float, sets the convergence criterion used to stop the iterations. By default, `criteria=1e-6`.
 * `-detail` is a flag which tells the program to show the result after each step.
-* `-plot` is a flag which tells the program show, graphically, evolution of the to-be-found optimum point on the contour of the objective function. This flag only works for 2D problems, i.e. S=S(x1, x2). Using this option, the program saves the 3D view as well as its 2D projection as .pdf files.
+* `-plot` is a flag which tells the program to use the post-processing capabilities of the `post_processing.py` routine. This flag only works for 2D problems, i.e. S=S(x1, x2). Using this option, the program saves the 3D view as well as its 2D projection as .pdf files.
 * `-scipy` is a flag which asks the program to solve the same problem using `scipy`'s built-in function for conjugate gradient minimization and to show the results for comparison.
 * `-h` is a flag which displays the help, listing all arguments and flags.

-**N.B.:** some systems require the user to type in `python3` instead of `python` in the command line.
+**N.B.**: some systems may require the user to type `python3` instead of `python` in the command line.

 ### Example:

 The previous command line, using the optional flags and arguments, will lead to:
 ```
 $ python conjugate_gradient.py -A 4 1 1 3 -b 1 2 -x0 2 1 -Nmax 5 -criteria 1e-12 -detail -plot -scipy
 ======================Summary======================
 Objective function: S = 0.5 * transpose(x) * A * x - transpose(x) * b
 A =
 [[4. 1.]
  [1. 3.]]
 b =
 [1. 2.]
 ---------------------------------------------------
 Step: 0
 Residual: 8.54400374531753
 Solution(x): [2. 1.]
 Objective: 7.5
 ---------------------------------------------------
 Step: 1
 Residual: 0.8001937042442401
 Solution(x): [0.23564955 0.33836858]
 Objective: -0.5498489425981874
 ==================Final Result=====================
 Number of steps: 2
 Residual: 5.551115123125783e-17
 Optimum x: [0.09090909 0.63636364]
 Optimum objective: -0.6818181818181817
 ===================================================
 ==================Scipy Results=====================
      fun: -0.6818181818181795
      jac: array([-9.68575478e-08, -4.47034836e-08])
  message: 'Optimization terminated successfully.'
     nfev: 20
      nit: 2
     njev: 5
   status: 0
  success: True
        x: array([0.09090906, 0.63636363])
 ====================================================
 ```
 By running the program, four separate blocks are displayed:
-* First, under "Summary", the objective function expression is shown to emphasize again on the notation used. Then `A` and `b` are shown to let the user check the inputs which define the problem.
+* First, under "Summary", the objective function expression is shown to re-emphasize the notation used. Then `A` and `b` are shown to let the user check the inputs defining the function to minimize.
 * Then, if the flag `-detail` is entered, the status of the sought solution is displayed for each iterative step. This status provides the iteration number, the residual (L2-norm of the gradient vector), the current estimate of the optimum, and the corresponding value of the original function S(X).
 * After that, under "Final Result", the total number of iterations, the final residual, the optimum xᵀ, and the extremum value of the objective function are displayed.
 * Finally, if the flag `-scipy` is entered, the same initial problem is solved using the conjugate gradient (CG) algorithm provided by `scipy.optimize.minimize` and the result is displayed. This is meant to provide a means of comparison between our implementation of the conjugate gradient algorithm and the one provided by `scipy` (a sketch of such a call is shown after this list).
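+
+For reference, the `scipy` comparison in the last block essentially boils down to a call of the following form. This is a sketch only, using the same A, b, and x₀ as in the example above; note the 1/2 factor that distinguishes this objective from the one in Program 1.
+```
+import numpy as np
+from scipy.optimize import minimize
+
+A = np.array([[4., 1.], [1., 3.]])
+b = np.array([1., 2.])
+
+def S(x):
+    # objective in the notation of this program (with the 1/2 factor)
+    return 0.5 * x.T @ A @ x - x.T @ b
+
+reference = minimize(S, np.array([2., 1.]), method='CG')
+print(reference.x)  # approximately [0.09090909, 0.63636364]
+```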

 ---

 ## Program 3: post_processing.py

-The function `plot_results` in this file gets the matrix A and the vector b as inputs (to construct the surfacae S(x,y)) as well as the intermediate solutions of the iterative minimization process (to mark location of the to-be-found optimum point at each step and connect them). This function also gets the method used as a string to display it in the title of each plot. The function generates a 3D plot displaying the to-be-minimized function S as well as the intermediate solutions. This plot is saved as `3D_representation.pdf`. This function also saves the 2D projection of the discussed 3D plot on the x-y plane as `2D_projection.pdf`.
+The function `plot_results` in this file takes the matrix A and the vector b as inputs (used to construct the surface S(x,y)) as well as the intermediate solutions of the iterative minimization process (used to mark the location of the current estimate at each step). The function also takes the name of the method used, as a string, and displays it in the title of each plot. It generates a 3D plot displaying the to-be-minimized function S as well as the intermediate solutions; this plot is saved as `3D_representation.pdf`. It also saves the 2D projection of this 3D plot onto the x-y plane as `2D_projection.pdf`.

 There are no specific instructions for this program since it is called from Programs 1 and 2 and is not designed to be used independently.
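+
+For completeness, a rough sketch of what such a function might look like internally is given below. This is not the actual implementation of `plot_results`: the function name, grid bounds, styling, and the 1/2-factor convention assumed for the surface are illustrative only; the inputs mirror those described above.
+```
+import numpy as np
+import matplotlib.pyplot as plt
+from mpl_toolkits.mplot3d import Axes3D  # noqa: F401, enables the 3D projection
+
+def plot_results_sketch(A, b, steps, method):
+    # Sample S(x, y) = 1/2 X^T A X - X^T b on a grid (2D problems only)
+    grid = np.linspace(-2, 3, 100)
+    X, Y = np.meshgrid(grid, grid)
+    Z = 0.5 * (A[0, 0]*X**2 + (A[0, 1] + A[1, 0])*X*Y + A[1, 1]*Y**2) \
+        - b[0]*X - b[1]*Y
+
+    path = np.asarray(steps)  # intermediate solutions, one row per step
+    S_path = 0.5 * np.einsum('ij,jk,ik->i', path, A, path) - path @ b
+
+    # 3D view: surface of S plus the intermediate solutions (red dots)
+    fig = plt.figure()
+    ax = fig.add_subplot(111, projection='3d')
+    ax.plot_surface(X, Y, Z, cmap='viridis', alpha=0.5)
+    ax.plot(path[:, 0], path[:, 1], S_path, 'ro-')
+    ax.set_title(method)
+    fig.savefig('3D_representation.pdf')
+
+    # 2D projection of the same data onto the x-y plane
+    fig2, ax2 = plt.subplots()
+    ax2.contour(X, Y, Z, levels=30)
+    ax2.plot(path[:, 0], path[:, 1], 'ro-')
+    ax2.set_title(method)
+    fig2.savefig('2D_projection.pdf')
+```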