- Lars B.
- Bertil T.
This repository contains all homework assignments for SP4E.
See the folder named *hw1-conjugate-gradient*.
The code has been tested with Python 3.
Run *main_minimize.py* with the following arguments:
- `-A <matrix elements in row-major order>`
- `-b <vector elements>`
- `-x0 <initial guess elements>`
- `-method <CG-scipy or CG-ours>`
- `-plot <True or False>`
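Since `-A` takes the matrix elements in row-major order, the program has to reshape the flat list into a square matrix whose size matches `-b`. The sketch below (our own illustrative helper, not the repository's actual parser) shows one way to do this with NumPy:

```python
import numpy as np

def parse_system(a_elems, b_elems):
    """Reshape row-major matrix elements into a square matrix.

    a_elems / b_elems mirror the -A and -b flags above; this is an
    illustrative sketch, not the code used in main_minimize.py.
    """
    b = np.asarray(b_elems, dtype=float)
    n = b.size
    # Row-major order is NumPy's default reshape order.
    A = np.asarray(a_elems, dtype=float).reshape(n, n)
    return A, b

# The Exercise 1 coefficients from the command below:
A, b = parse_system([8, 0, 2, 6], [0, 1])
```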
The following command will run the program with the coefficients given in Exercise 1:
`python3 main_minimize.py -A 8 0 2 6 -b 0 1 -x0 4 4 -method CG-scipy -plot True`
NB: the quadratic function in Exercise 1 is implemented with a multiplicative factor of 1/2 in order to be consistent with Exercise 2.
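With that 1/2 convention, the objective being minimized has the form f(x) = 1/2 xᵀAx − xᵀb. A minimal sketch (the function name and variables are ours, not the repository's):

```python
import numpy as np

def quadratic(x, A, b):
    # f(x) = 1/2 x^T A x - x^T b, matching the 1/2 convention noted above.
    x = np.asarray(x, dtype=float)
    return 0.5 * x @ np.asarray(A, dtype=float) @ x - x @ np.asarray(b, dtype=float)

# Exercise 1 coefficients, passed row-major as in the command above.
A = np.array([[8.0, 0.0], [2.0, 6.0]])
b = np.array([0.0, 1.0])
```

For a symmetric positive-definite A, the gradient of f vanishes exactly where Ax = b, which is why minimizing f and solving the linear system coincide.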
The following command will run the program with a 2x2 s.p.d. (symmetric positive-definite) matrix A and a 2-dimensional vector b, using our self-implemented conjugate gradient method to solve the LSE Ax=b:
`python3 main_minimize.py -A 3 0 0 4 -b 4 5 -x0 12 12 -plot True -method CG-ours`
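For reference, the textbook conjugate gradient iteration looks roughly like the sketch below. This is a generic implementation under our own naming, not necessarily identical to the repository's CG-ours:

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10):
    """Textbook CG for a symmetric positive-definite A (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    r = b - A @ x          # initial residual
    p = r.copy()           # first search direction
    rs_old = r @ r
    for _ in range(len(b)):  # exact arithmetic converges in <= n steps
        Ap = A @ p
        alpha = rs_old / (p @ Ap)   # step length along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # A-conjugate direction update
        rs_old = rs_new
    return x

# The system from the command above: diagonal, hence s.p.d.
A = np.array([[3.0, 0.0], [0.0, 4.0]])
b = np.array([4.0, 5.0])
x = conjugate_gradient(A, b, x0=[12.0, 12.0])
```

For this 2x2 system the exact solution is x = (4/3, 5/4), which the iteration reaches in at most two steps.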
NB: A must be s.p.d. in order for the conjugate gradient method to correctly solve Ax=b. The matrix A from Exercise 1 is not symmetric, so it should not be used to compare the Scipy version (CG-scipy) against our version (CG-ours). In the output of Exercise 1, you will see that the LSE Ax=b is not solved (the reported residual is large).
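A quick way to check the s.p.d. requirement before running a comparison is to test symmetry and the sign of the eigenvalues. This helper is our own illustration; the program's flags above do not include such a check:

```python
import numpy as np

def is_spd(A, tol=1e-12):
    """Return True if A is symmetric with strictly positive eigenvalues."""
    A = np.asarray(A, dtype=float)
    if not np.allclose(A, A.T):
        return False
    # eigvalsh is the symmetric-matrix eigenvalue routine.
    return bool(np.all(np.linalg.eigvalsh(A) > tol))

ok = is_spd(np.array([[3.0, 0.0], [0.0, 4.0]]))   # the CG-ours example above
bad = is_spd(np.array([[8.0, 0.0], [2.0, 6.0]]))  # Exercise 1: not symmetric
```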
NB: plotting is only available in 1D (A and b are both scalars) or 2D (A is a 2x2 matrix and b is a 2-dimensional vector), since higher-dimensional problems cannot be visualized in a meaningful way.
See the *homework2/* folder and the instructions in its own README file.