# Minimization of a scalar function in Python

(SP4E 2018, Homework 1)

Authors: Sajjad Azimi, Alessia Ferraro

## Description and usage

* **optimizer.py**

  The program optimizer.py finds the minimum of a given scalar function using the scipy.optimize.minimize module. Via
  command-line options, the user can choose the minimization algorithm and whether the starting point is randomly
  generated (see the help mode, `-h`):

  * `-CG` (Conjugate Gradient)
  * `-BFGS` (Broyden–Fletcher–Goldfarb–Shanno)
  * `-RI` to generate a random initial condition

  Once the optimization process reaches the required tolerance, a plotting routine shows both the surface and the
  minimization path of the chosen algorithm.

* **conjugate_gradient.py**

  To run conjugate_gradient.py, execute the following command in a terminal:

  `python conjugate_gradient.py`

  The script calls a function implementing the conjugate gradient method to find the minimum of the function S given in
  the homework description. The matrix A and the vector b defining S are hard-coded as np.array objects, and the initial
  condition is the fixed point [3.0, 1.0]. After the program runs, the number of steps and the final solution are
  printed to the screen, and a 3D plot of the surface together with the path of the iterate during convergence is shown
  in a new window. With the same initial condition, the iterate converges exactly as in exercise 1 with the optimizer.
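
The iteration used by conjugate_gradient.py can be sketched as below. This is a minimal sketch of the standard conjugate gradient update, assuming S is the quadratic form S(x) = ½ xᵀAx − bᵀx (so minimizing S is equivalent to solving Ax = b); the function name `conjugate_gradient` and the A, b values shown here are placeholders for illustration, not the actual data from the homework description.

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-9, max_iter=100):
    """Minimize S(x) = 0.5 x^T A x - b^T x for symmetric positive-definite A.

    Equivalent to solving A x = b. Returns the solution and the visited points.
    """
    x = x0.astype(float)
    r = b - A @ x            # residual = negative gradient of S at x
    p = r.copy()             # first search direction is steepest descent
    path = [x.copy()]
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)        # exact line search along p
        x = x + alpha * p
        r_new = r - alpha * Ap            # updated residual
        beta = (r_new @ r_new) / (r @ r)  # conjugacy coefficient
        p = r_new + beta * p              # next A-conjugate direction
        r = r_new
        path.append(x.copy())
    return x, np.array(path)

# Placeholder data; the real A, b, and initial point come from the homework.
A = np.array([[4.0, 0.0], [0.0, 2.0]])
b = np.array([0.0, 1.0])
x_min, path = conjugate_gradient(A, b, np.array([3.0, 1.0]))
```

In exact arithmetic the method reaches the minimum in at most n iterations for an n-dimensional problem, which is why the path plotted by the script contains only a few points.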