Question: I'm wondering whether we have a similar formula when the inverse in the Sherman-Morrison formula is replaced by the Moore-Penrose pseudoinverse, in the case that $\mathbf A$ is a singular matrix.

Answer: If $\mathbf A$ is symmetric and so is the update to it, then I find that the Sherman-Morrison formula works as is (replacing the inverse with the pseudo-inverse, of course). There are results on the general rectangular case (e.g. the Benaych-Georges and Nadakuditi paper "The singular values and vectors of low rank perturbations of large rectangular random matrices" (2012)), but I do not think they will help you get a solution soon; I would suggest you keep your focus on the image-processing literature. Otherwise, if I am correct, the formula gives you only a generalized inverse, and a correction using the null space is required to make it the desired pseudo-inverse. This might also be useful: "Similarly, it is possible to update the Cholesky factor when a row or column is added, without creating the inverse of the correlation matrix explicitly."
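To illustrate the symmetric case, here is a minimal NumPy sketch. The restriction that the rank-one update is $uu^T$ with $u$ in the range of $\mathbf A$ is my own assumption (chosen so that the pseudo-inverse version of the identity can be expected to hold); the comparison against `numpy.linalg.pinv` is a numerical sanity check, not a proof.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a singular symmetric matrix A of rank 2 in R^{4x4}.
B = rng.standard_normal((4, 2))
A = B @ B.T                      # symmetric PSD, rank 2, hence singular

# Symmetric rank-one update u u^T, with u deliberately chosen in the
# range of A (an assumption; the identity can fail otherwise).
u = A @ rng.standard_normal(4)

A_pinv = np.linalg.pinv(A)       # A_pinv is symmetric since A is

# Sherman-Morrison with the inverse replaced by the pseudo-inverse:
# (A + u u^T)^+ = A^+ - (A^+ u u^T A^+) / (1 + u^T A^+ u)
denom = 1.0 + u @ A_pinv @ u
update_pinv = A_pinv - np.outer(A_pinv @ u, u @ A_pinv) / denom

# Compare against NumPy's direct pseudo-inverse of the updated matrix.
direct = np.linalg.pinv(A + np.outer(u, u))
print(np.allclose(update_pinv, direct, atol=1e-8))
```

The update costs only matrix-vector products, versus a full SVD inside `pinv` for the direct route.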
Indeed, almost all the information contained here in some sense duplicates existing textbooks, online resources, or research papers. The basics of this update are dictated by the Sherman-Morrison formula; for a rank-$k$ update of the form
$$A^* = A - UV^T$$
the Woodbury formula comes into play. If you look at these formulas, you will notice that there are a lot of inverses involved, and you have already solved a great deal of their subsystems. (The uber-classic "Matrix Algebra From a Statistician's Perspective" by Harville unluckily does not touch on rank updates at all.) Looking at the statistics/application side of things, rank-one updates are common in recommender systems, because one may have thousands of customer entries, and recomputing the SVD (or any given decomposition, for that matter) each time a new user registers or a new product is added or removed is quite wasteful (if not unattainable). Cholesky updates exist as well. "Ditch" the recommender literature then, and focus on image processing: similar questions have been posted in terms of adding "new images" to a database.
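To make the computational point concrete: for the rank-$k$ update $A^* = A - UV^T$, the Woodbury formula only requires inverting a $k \times k$ matrix once $A^{-1}$ is known. A hedged NumPy check follows; the test matrices are arbitrary, and `A` is made diagonally dominant only so that both inverses are guaranteed to exist.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 2

A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned, invertible
U = rng.standard_normal((n, k))
V = rng.standard_normal((n, k))

A_inv = np.linalg.inv(A)

# Woodbury identity for the rank-k update A* = A - U V^T:
# (A - U V^T)^{-1} = A^{-1} + A^{-1} U (I - V^T A^{-1} U)^{-1} V^T A^{-1}
core = np.linalg.inv(np.eye(k) - V.T @ A_inv @ U)  # only a k x k inverse
woodbury = A_inv + A_inv @ U @ core @ V.T @ A_inv

# Compare against inverting the updated n x n matrix from scratch.
direct = np.linalg.inv(A - U @ V.T)
print(np.allclose(woodbury, direct))
```

With $k \ll n$ (a new user or two against thousands of entries), the $k \times k$ inverse is far cheaper than redoing the full $n \times n$ factorization.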