2010-10-17

Coloring release

Munipack development notes


Version naming

I have introduced a new naming scheme for releases. Each name will be inspired by that release's crucial development feature.

The name of the current release emphasizes the development of color-handling utilities (coloring, color tuning, color space conversions, etc.).


Release period

The last stable release was issued in February 2010; I think that period is really very long.
The current release includes not only color-related features
but also many other important improvements. Notably,
it offers a new command-line interface, so the release could just as well
have been named the "commander release" or "general failure". ;)

In addition, I have of course forgotten many details of what was developed half a year ago.

Also, many crash reports (sent via the built-in dialog) are due to errors in the binary release which I fixed months ago. So it would be more useful to issue binary releases more frequently. Ideally, a release would immediately follow the implementation of an important feature.


Homepage design

I have modernized the schematic design of the homepage. CSS has been used extensively. The design is probably inspired, as far as I remember, by the design of the guide for MS Fortran PowerStation, 1994 edition.

I also tried to apply the rules of classical typography to the Web. Applying them in detail is not possible, but many rules can simply be used. The most visible one is limiting the length of a single line to 80 characters. It may look a little uncommon, but I think reading is more comfortable.

I worry about the use of floating elements (tables and images) in a Web page, which break the text flow in an ugly way. On the other hand, placing all such elements at the bottom or on separate pages is even more disturbing. I expect some progress in the future.

There is an interesting post about typography (although nothing from it has been used here except the examples).

I have tried to use these rules:
  • book-like design
  • column width of 80 characters
  • minimized number of fonts used

XML serialization

When I implemented the archive and DND functions, I developed an XML serialization of FITS headers, structure and thumbnail data (the contents of ~/Munipack). The concept is partially similar to the VOTable format. The XML files (including thumbnails in PNG) can easily be pushed over TCP sockets, streams, etc.
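For illustration, a fragment of such a serialization might look like the sample below. The element and attribute names are invented for this example only; they are not the exact tags used by Munipack:

<fits file="m51_R.fits">
  <hdu index="0" type="image" naxis1="1024" naxis2="1024">
    <keyword name="OBJECT"  value="M51"   comment="observed object"/>
    <keyword name="EXPTIME" value="120.0" comment="exposure time [s]"/>
    <keyword name="FILTER"  value="R"     comment="photometric filter"/>
  </hdu>
  <thumbnail format="png" encoding="base64">iVBORw0KGgo...</thumbnail>
</fits>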

Sub-processes

This release is also a proof of concept for the invocation of external binaries within a GUI session. The GUI is a controller (a GUI wrapper) for the binaries. All working utilities (conversion of raw pictures, coloring, dark and flat averaging and corrections) are implemented as sub-processes of the GUI (via the wxExecute function, which is probably just a wrapper of the exec(3) family of functions under Unix).

This concept separates the GUI from the working parts. It also significantly reduces the complexity of the full package. The utilities communicate via standard input/output, so there is wide flexibility in replacing the required routines. It would also be easy to use the utilities in another environment (Web, etc.).
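The following minimal sketch illustrates the controller/worker idea with POSIX popen(3). It is not the actual GUI code (which drives the utilities through wxExecute); the command "tr a-z A-Z" only stands in for a Munipack utility reading standard input and writing standard output:

// controller.cpp - sketch of a controller reading a worker's standard output
#include <cstdio>
#include <iostream>

int main()
{
  // start the "worker" and read its standard output through a pipe
  FILE *pipe = popen("echo frame1.fits frame2.fits | tr a-z A-Z", "r");
  if (!pipe) { perror("popen"); return 1; }

  char line[256];
  while (fgets(line, sizeof line, pipe))    // collect the worker's results
    std::cout << "worker says: " << line;   // a real GUI would update a window

  return pclose(pipe);                      // wait for the worker to finish
}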

The successful proof of this concept is probably the most important experience of this release.

Plplot

All plotting has been rewritten to use the plplot library. This is just a preparation for more powerful plotting tools.


Displaying improvements

In the previous release, images were rendered by passing every pixel to the routines as a single argument. This approach is slow: the rendering needs calls to two or more functions (tone + color) per pixel, which adds noticeable overhead (on the order of percent). A more important aspect is showing progress. Firing events for every pixel to indicate progress is a very bad idea, so an image was only displayed when rendering of the full frame was complete. That may be stressful for the user, who expects results immediately.

Therefore, I rewrote the progressive display to use tiling. The image is split into tiles of a suitable size, 137x137 pixels (the fine-structure constant). The rendering function then works on a whole array, not on single pixels. The rendering results can be displayed continuously (only a few events are usually needed), which is more comfortable for the user. The progressing rendering naturally acts as a progress bar.
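A rough sketch of the tiling idea (not the actual Munipack code; the tone/color mapping is reduced to a trivial clamp here):

// tiles.cpp - render a frame in 137x137 blocks instead of pixel by pixel
#include <algorithm>
#include <vector>

const int TILE = 137;   // tile edge in pixels (the fine-structure constant)

// placeholder tone mapping applied to one whole tile at once
static void render_tile(const std::vector<float> &data, std::vector<unsigned char> &out,
                        int width, int height, int x0, int y0)
{
  for (int y = y0; y < std::min(y0 + TILE, height); ++y)
    for (int x = x0; x < std::min(x0 + TILE, width); ++x)
      out[y * width + x] = (unsigned char) std::min(255.0f, data[y * width + x]);
}

void render_frame(const std::vector<float> &data, std::vector<unsigned char> &out,
                  int width, int height)
{
  for (int y0 = 0; y0 < height; y0 += TILE)
    for (int x0 = 0; x0 < width; x0 += TILE) {
      render_tile(data, out, width, height, x0, y0);
      // here the real code posts one "tile ready" event to the GUI, so the
      // picture appears progressively and acts as its own progress bar
    }
}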

Another improvement is pre-scaling of images: the already scaled image is rendered, which significantly speeds up tuning of large pictures.

The speed for normal-size gray images is satisfactory. Commonly, the eye does not notice any blinking (flickering); the rendering times are usually under 0.2 s. Unfortunately, the tiling is clearly visible on color images because the transformations are more expensive, and rendering may take seconds.




New installer

The most visible change is the use of the ESP Package Manager (EPM), which provides a graphical installer similar to the widely known installers of the Windows or Mac OS worlds. The installation is pretty straightforward. All files are installed to /opt; there is no possibility to change the location.

By the way, the google-chrome deb package does the same job. It is designed to unpack a set of binaries to /opt. The installer also creates appropriate links to the binaries and runs the xdg tools to add google-chrome to the global menu and desktop.

The graphical installer is not the ideal way to install when a packaging system is available. In the future, it would be better to distribute binary packages (deb, rpm?) as well.

The EPM installer (setup) includes the FLTK library. It also needs some binary libraries (with the proper version numbers!) present in the system (png, jpg and the X libraries). Is there a way to link them statically?




Binary bundle

I spent a very long time developing a new build script.
The script is relatively solid and easy to use, so generating
the binaries for a new release will not bother me too much.
See dist/builder.sh.

The Linux packaging systems (deb, rpm, ... packages)
are really great. Unfortunately, there are really many
Linux distributions and combinations of various libraries
and tools. Distribution of third-party tools is a really big
problem which is not satisfactorily solved yet.

The crucial point is library dependencies. Their placement and presence
are important at run time. Mutual (cyclic) library dependencies can
cause big problems for a binary distribution.

So I have chosen the following way for the distribution:
  • All files are placed in /opt/munipack. The hard-coded installation directory is required by EPM.
  • The distribution includes libraries outside the typical distribution set (wxWidgets and plplot). This also guarantees correctly set compilation options and library versions.
  • Some binaries (and .so libraries) set an rpath. Usually the rpath is hard-coded during compilation and differs from the installation path, so I replace it with the patchelf utility (see http://www.eyrie.org/~eagle/notes/rpath.html).
  • The distribution script is dist/builder.sh.

2010-03-04

Continuous motion of Barnard's star

Some time ago, I published an article about visualising the proper motion of Barnard's star. Now I have added images from the next two years. As we expect, the position of the object is nicely predictable.

The last two images (years 2008 and 2009) were taken during autumn, as opposed to the solstice images from 2005 and 2006. No shift due to parallax is visible. Simbad gives a parallax of about 0.5", which should be detectable with our instruments, in my opinion, if we select the times between exposures more carefully.

The Barnard's star images do not have uniform brightness due to different observing conditions.


Barnard's star motion between 2001 and 2009. Full field of view.

2010-02-11

A Graphical User Interface for Munipack

I started work on the graphical user interface (GUI) for Munipack a year ago. The first public experimental version including the GUI is now ready for download. Thus, I think now is the time to summarize some of my experiences with GUI development.

Why GUI?

The steps towards developing a GUI ripened inside me for a long time. The last story that roused me was processing large data sets of images from our digital camera. I scaled their intensities manually, which produced a very long sequence of commands with fine-tuning options.

Screenshot of my older terminal

Other reasons:
  • simple visual manipulation of FITS files
  • processing of astronomical data by my students migrating from other platforms
  • it is a real shame that we do not have a simple astronomical data manipulation tool in the 21st century
  • command-line utilities need a lot of experience to work with
  • I want to preserve my data-processing skills (no written description is satisfactory with respect to dynamic actions; a white paper is not the way)
  • a simpler and more interactive way of operating on images
Not all of these aims are covered by the current version, but they are planned.

Inspirations

I started the work with long-time experience of using ds9 and Gaia. Both are my personal favorites. Unfortunately, neither fully supports my needs and the reasons above. Initially, I started work on the GUI as a modernized copy of ds9. Later I abandoned that path, but many features introduced by those tools are present.

As a first step, I coded the Help window as a simple copy of Gnome's Epiphany browser, which can be used to display the on-line help. Epiphany was one of the first cleanly designed web browsers.

A help (large)

The Browser window with thumbnails is similar to another Gnome utility: Nautilus, the file browser.

A browser (large)

The dynamic behavior during image rendering/loading is similar to iPhoto. Loading a large image can take many seconds, and the user is confused when seeing only blank menus or controls. So I used this trick: first display an appropriately scaled icon, and display the rendered image later. This gives the effect of a "working application" without confusing its user.

A view during loading (large)

The design

In the early phases, I designed the GUI practically at random. A lot of features were taken from ds9 or Gaia. After a period of frequent redesigning, I found the document Apple Human Interface Guidelines, and afterwards I cleaned up the concept. In my opinion, the chapter Characteristics of Great Software is based on the work and ideas of Jef Raskin.

The key principles:
  • Keep everything as simple as possible.
  • Don't distract the user from the ideas he or she is working on.
  • The user is not a programmer. He or she is an expert in astronomy and is not (and does not want to be) an expert in programming.
See J. Raskin's rules.

In astronomical processing, the user thinks in an absolutely different working framework and in different technical terms (for example: spectral distributions, black holes, fluxes) than a programmer. So I tried to design the GUI that way. For example, a FITS file is displayed as a tree structure with single parts representing each HDU. The user can easily show info about an image as well as the image itself.

Don't expect the GUI to cover all the functionality of the command-line utilities.

The programming language selection

The code of Munipack has been developed in Fortran up to today. I used Fortran 77 in the early days, but now everything except munimatch is coded in Fortran 90. Unfortunately, there is no graphical toolkit for Fortran.

I selected C++ after some forethought. Plain C is too primitive and too difficult for GUI programming. Pascal and Ada have poor library support. Go looks attractive (it is like C++ with a garbage collector) but is not mature yet and also has no support from a graphical toolkit.

I abandoned Java, Python, Ruby, etc. because of their very slow execution and the need for a virtual machine. Also, I cannot push users into installing some additional exotic virtual machine when they already have perfectly capable CPUs. In my opinion, it is a very bad idea to write an image-processing utility in an interpreted language. The counterargument that development is faster is false: I spent approximately the same time developing in Python and in C++ under wxWidgets.

Also, their run-time speed is important. All interpreters are slow, really slow. Pulling down a menu becomes a great action taking many seconds. :) Replacing sub-elements in a window induces a pause long enough for a cup of coffee. :) But users have no time; a user wants to see results immediately. This matters less for GUI elements, but it is really important when displaying large images: the image must be recomputed (rendered) before every display.

The operation must be done on the CPU because I have not found a way to do it on a graphics card (if there is a way, I think the OpenGL specification doesn't support my needs). To illustrate the computation, please try code which loops over 1000x1000 pixels with a simple operation like the sum of two real numbers, in various languages. I got: C++ under 0.1 s; Java and Python a few seconds. Because the operations are done very frequently, the interpreted languages are unusable for me.
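A rough C++ version of that timing test might look like this (the exact numbers will of course depend on the machine and compiler flags):

// bench.cpp - time one pass over a 1000x1000 "image" with a trivial operation
#include <chrono>
#include <iostream>
#include <vector>

int main()
{
  const int N = 1000;
  std::vector<double> a(N * N, 1.5), b(N * N, 2.5), c(N * N);

  auto t0 = std::chrono::steady_clock::now();
  for (int i = 0; i < N * N; ++i)
    c[i] = a[i] + b[i];                       // the simple per-pixel operation
  auto t1 = std::chrono::steady_clock::now();

  std::cout << "1000x1000 additions took "
            << std::chrono::duration<double>(t1 - t0).count() << " s "
            << "(check value " << c[0] << ")" << std::endl;
  return 0;
}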

The toolkit

Alternatives for the graphical toolkit:
  • Gtk+ - with gtkmm it may be usable, but its relatively rapid development and non-native look don't fit my needs.
  • Qt - controversial license, development depends on one company, unaesthetic look.
  • GNUstep - a very attractive alternative which offers the look, controls and behavior of Mac OS X; moreover, it is programmed in Objective-C. Unfortunately, its graphical look under Linux is like one from the past.
All the others are immature and too simple (FLTK, FOX, ...) or too retro in style (Motif, Athena, ...).

Finally, I selected wxWidgets. The toolkit offers many amazing features.
Two are crucial for me:
  • Portability (multi-platform). It can run under many environments including Unix, Mac OS X, Windows and smart phones. The library acts as a unified layer between the portable code (developed by me) and a native toolkit. Impressive! The result looks like a native application. Many additional features like portable configuration support, filesystem management and many more helpers are implemented.
  • Memory management. wxWidgets has two approaches to memory management. Graphical widgets (windows, controls, dialog elements, ...) are created as pointers and destroyed by the library itself, so their memory management is handled by the library. The second approach is reference counting implemented in the library core; therefore I can pass all my data as references (not pointers) without risk of memory leaks. Both techniques practically give a kind of automatic garbage collection. The memory is checked with valgrind.

The GUI core

I spent a large amount of working time on:
  • Design and implementation of a FITS representation on top of CFITSIO. I developed a set of C++ classes representing a general FITS file. The implementation can be used as a simple library on top of CFITSIO; the CCfits library does not fit my needs.
  • I'm using two sets of classes. The first represents an original FITS file, and its structure is fully used in the View window. The second represents a FITS meta-object containing only headers and icons, without memory-consuming bitmaps or tables; this structure is used in the Browser window.
  • GUI design (see the rest of the post).
  • Events. Interaction between controls of different windows is implemented by all modern windowing systems via events. Unfortunately, the implementation is very primitive: a single object sends events to another specified object, but it is very difficult to send a signal to several objects, get a response, and so on. The basic idiom used in this context is model-view-controller; I'm using some of its ideas for displaying images, but I developed my solution in a slightly different way and independently.
  • Use of threads. This was one of the big problems I had to solve. The windowing system runs handlers for prepared actions without interruption; if any action takes a longer period (over 0.7 s), the user sees the application hang, which is very confusing behavior. The right way to solve it is via independent threads: the application then responds very quickly and is a pleasure to use. It is very nice when more than one CPU (or core) is installed, because the threads can run on different CPUs and significantly speed up the application. A minimal sketch of the idea follows below.
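A minimal, generic sketch of that idea (the GUI itself uses wxWidgets threads; std::thread is used here only for illustration, and the sleeps stand in for real work and event handling):

// threads.cpp - keep a long computation off the thread that handles events
#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>

std::atomic<bool> done(false);

void long_rendering()                    // the slow work, e.g. image rendering
{
  std::this_thread::sleep_for(std::chrono::seconds(2));
  done = true;                           // real code would post an event here
}

int main()
{
  std::thread worker(long_rendering);    // start the background computation
  while (!done) {                        // the "event loop" stays responsive
    std::cout << "GUI still alive" << std::endl;
    std::this_thread::sleep_for(std::chrono::milliseconds(500));
  }
  worker.join();
  std::cout << "rendering finished" << std::endl;
  return 0;
}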

During a hang-up (large)

The GUI design

I chose an untypical layout (with respect to ds9 or Gaia) as the result of many experiments. The View window is divided into two parts. The left is a control panel where the structure and additional info (histogram, size) of the FITS file are displayed, and the right is for displaying the contents. The window can show all images, tables and headers.

The geometry is ideal for modern monitors (commonly in 16:9 format; Xerox Alto computers are very rare to meet today), where a maximal area of the image can be displayed without compromises. The left part is near the other controls (menu and toolbar) for the right ergonomics (my personal preference is the close button on the left).

Two important improvements over ds9 and Gaia are included:
  • The image is fitted to the window size during load (like in iPhoto). This speeds up loading of large images, and the image is also completely prepared for a quick inspection.
  • The image is automatically scaled in intensity, so the user will probably see most of the details in the image immediately.
Some controversy may arise from the separate windows for tuning and for the detail view. Many people don't like this behavior. I checked both alternatives: dialogs included as tabs in the control area, and separate windows. Perhaps the inclusion is too restricting: the window has limited space, so you can't freely move or resize the dialogs, the layout and labels must fit the space, you can't have both windows open together, etc. A useful discussion could clarify the layout.

A detail (large)

Bugs

The current version is relatively stable, but it has not been tested on a wide range of software configurations or operating systems. Therefore I expect a lot of bugs to fix.

There are two channels for reporting bugs:
  • The issue tracker under Google Code. You must have an OpenID account to use it. You can also use the tracker to request new features.
  • When the application crashes, it shows a dialog with a log which can be used to report directly to my FTP server. The log can contain some private information; please check that you are comfortable with me reading it, and if you accept passing a (really) small amount of information about your configuration, please send the report.

A crash report

You are also welcome in Munipack's Google group, where you can report bugs or discuss in more detail other aspects of Munipack and astronomical processing.

Discovery of a New World

The detailed structure of FITS files has revealed a new world to me. With its help, many FITS files show unexpected additional features. Did you know that DSS images have two parts? Using the parts is trivial, not complicated. The structure of a FITS file is displayed as an integral part of the FITS world, so users are not surprised by a complicated structure. This is especially important when inspecting an unknown file.

Horsehead Nebula (large)

Unimplemented features
  • List mode in the Browser (don't show icons, show a table of files). It is probably useless because the thumbnails would not be visible; I'm hesitant to support it.
  • Orthogonal intensity scaling. Well-behaved applications have orthogonal controls: when one tunes a quantity, the other quantities are not affected. Unfortunately, the intensity scaling is not implemented orthogonally; the image gets brighter or darker with both sliders. I have no idea how to orthogonalise it.
Please keep in mind that everything can change.

2010-01-15

Astrometry Calibration

I have added experimental support for astrometry to Munipack. It is primarily designed for astrometry on CCD frames, so only the gnomonic projection and an affine transformation (shift and rotation) are supported yet. Both limitations can be lifted in future releases.

The astrometry code is available on Google Code now.

The algorithm

I coded my own implementation of the widely known "astrographic plate
measurement" (Smart & Green, Tagg).

On input:
A table with Right Ascension and Declination (from any catalogue) and the corresponding measured rectangular x and y coordinates for every object.

The algorithm:
  1. Get the center of the picture in pixels as half of both sizes.
  2. Estimate the center of projection in spherical coordinates by the mean.
  3. Project the spherical coordinates to rectangular ones (the gnomonic projection; see the sketch below).
  4. Estimate the scale as the mean ratio between the distances of projected objects and the measured distances.
  5. Estimate the angle of rotation between the projected and measured coordinates by using the properties of the scalar product of vectors.
  6. Compute the transformation by a minimization method with starting values given by items 3, 4 and 5.
  7. Check the offsets between the centers of the projected and measured coordinates. If they are too great, make a better estimate of the center of projection and repeat from item 2; otherwise finish.
On output:
Precise coordinates of the center of projection, the scale and the position angle of the image. Optionally, statistical parameters.
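The projection used in step 3 is the standard gnomonic (tangent plane) projection. The following C++ fragment is only an illustrative sketch of those textbook formulas, not the Fortran code in astrometry/astrometry.f90, and the coordinates in the example are invented:

// gnomonic.cpp - project (alpha, delta) onto the tangent plane at (alpha0, delta0)
#include <cmath>
#include <cstdio>

void gnomonic(double alpha, double delta, double alpha0, double delta0,
              double &xi, double &eta)
{
  // all angles in radians; (xi, eta) are the standard coordinates
  double d = std::sin(delta) * std::sin(delta0) +
             std::cos(delta) * std::cos(delta0) * std::cos(alpha - alpha0);
  xi  = std::cos(delta) * std::sin(alpha - alpha0) / d;
  eta = (std::sin(delta) * std::cos(delta0) -
         std::cos(delta) * std::sin(delta0) * std::cos(alpha - alpha0)) / d;
}

int main()
{
  const double deg = 3.14159265358979 / 180.0;
  double xi, eta;
  // a star about one degree from an invented projection centre
  gnomonic(181.0 * deg, 30.5 * deg, 180.0 * deg, 30.0 * deg, xi, eta);
  std::printf("xi = %f deg, eta = %f deg\n", xi / deg, eta / deg);
  return 0;
}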

The described algorithm is relatively general, so it can easily be modified for other projections or for more complex transformations of the rectangular coordinates like distortions, pincushion, etc.

The algorithm is iterative because we need to know the center of projection before we have a calibrated image. Usually, only three iterations are required.

The Robust Algorithm

An algorithm usable in the real world needs to be robust: small deviations should cause only small variances in the output parameters, and outliers should have only a small influence on the solution. Simple examples of the use of robust methods can be found in Launer & Wilkinson: Robustness in Statistics: Proceedings of a Workshop (1979) or in Numerical Recipes. Munipack implements a robust mean estimator in lib/statistics.f90 as rmean. Another use of a robust algorithm is in the straight-line fitting below.

I applied the basic scheme of robust algorithms to the above algorithm:
  1. Estimate the median of absolute deviations (MAD) using minimization of the absolute deviations (Nelder-Mead).
  2. Use the method of maximum likelihood to estimate the proper transformation.
The astrometry module is implemented in astrometry/astrometry.f90. The estimation of the MAD is a loop around nelmin; the maximum likelihood is a loop around Minpack's hybrd.

The fitting is decomposed into three single steps (estimation of the center of projection, the scale and the affine transformation), so we work in parameter sub-spaces (we are not fitting all the parameters together). Numerical experiments give me more robust behavior and more precise solutions than simultaneous fitting of all parameters. In my opinion, this is a consequence of the cross-correlations between the parameters. It is interesting that QR decomposition does not solve the problem.

The algorithm is optimized for precision, not for speed, so it may be slow for a large number of objects.

The Low Precision Algorithm

The robust algorithm can only be used for many objects (ideally tens of stars and more); the absolute minimum is five stars. I suppose that astrometry will sometimes be required for fewer stars. In this case, I switch to a simple algorithm which uses only the median to estimate the required parameters.

For two stars, the transformation can be determined, but it is not possible to estimate the uncertainties of the parameters. For a single star, it is not possible to estimate the scale or rotation at all.

Input/Output "protocols"

I chose an unusual way to set the input and get the output data. The input stream is a text file consisting of a set of lines (records) with the same structure as FITS headers. From the user's point of view, the routine acts as a transformation filter: the text file on input is modified by replacing lines and adding new ones to the output text file.

The output structure is perfectly suitable for writing directly to a FITS header in standard WCS format. We need an external utility to merge the output lines into an existing FITS file; Munipack provides the munifits tool for that task.
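For illustration, the added lines could look like the fragment below. The keywords are the standard FITS WCS ones; the values are invented here, and the exact set written by the utility may differ:

CTYPE1  = 'RA---TAN'           / gnomonic projection in right ascension
CTYPE2  = 'DEC--TAN'           / gnomonic projection in declination
CRPIX1  =                512.5 / reference pixel in x
CRPIX2  =                512.5 / reference pixel in y
CRVAL1  =            269.45000 / RA of the projection centre [deg]
CRVAL2  =              4.69000 / Dec of the projection centre [deg]
CD1_1   =          -0.00029000 / transformation matrix [deg/pixel]
CD1_2   =           0.00000300
CD2_1   =           0.00000300
CD2_2   =           0.00029000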

It is possible to add other types of output, but I think there is no other widely used standard astrometric format. If this way proves successful, I'll re-implement it for the remaining utilities of Munipack.

The style may be a little bit strange, but it is absolutely the same as standard e-mail handling. A user interface (Thunderbird, elm, ...) creates a well-formatted input file and passes it to an SMTP daemon. The daemon (or its successors) delivers it (as a copying filter) to a recipient, and some client decodes it. Note that many Unix utilities use the same idiom (sed, awk, ...). The main advantage of this style is simple modification of the format, as well as backward and forward compatibility. [http://catb.org/~esr/writings/taoup/]

The protocol is described in detail in the included manual page and test example.

An example of a calibrated image with UCAC2 stars displayed. Thanks, Mr. Pavel. Large size.

2010-01-05

Fitting of Straight Line

One of the most trivial problems of statistical regression analysis is the fitting of a straight line. I selected this well-known problem to illustrate the use of Minpack and of robust fitting.
All the required code can be found in the archive; please read README for a detailed description of the included files.

Reference Data and Solution

As data for a working example, I selected tabulated values for a straight line from the excellent mathematical handbook Survey of Applicable Mathematics by K. Rektorys et al. (ISBN 0-7923-0681-3, Kluwer Academic Publishers, 1994). The data set is included in the archive as line.dat.

Normal equations:
 14*a +  125*b = 170
125*a + 1309*b = 1148.78
and LS solution:
a   = 29.223
b = -1.913
S0 = 82.6997
rms = 2.625
sa = 1.827
sb = 0.189
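As a quick cross-check of the reference solution, the 2x2 normal equations above can be solved directly by Cramer's rule (this is only a sanity check, not the Minpack code from the archive):

// normal.cpp - solve the normal equations of the reference data set
#include <cstdio>

int main()
{
  // coefficients of the normal equations quoted above
  double n = 14.0, sx = 125.0, sxx = 1309.0, sy = 170.0, sxy = 1148.78;

  double det = n * sxx - sx * sx;             // determinant of the system
  double a = (sy * sxx - sx * sxy) / det;     // intercept
  double b = (n * sxy - sx * sy) / det;       // slope

  std::printf("a = %.3f  b = %.3f\n", a, b);  // prints a = 29.223  b = -1.913
  return 0;
}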





Using Minpack

Simple usage of Minpack in the LS case is straightforward. One calls hybrd (the Jacobian is approximated by numerical differences) or hybrj (the second derivatives must be supplied) and passes a subroutine which computes the vector of residuals at a point required by Minpack. Minpack uses Powell's hybrid method, which combines a gradient (steepest-descent) step far from the minimum (move in the direction of the steepest slope) with Newton's method near the minimum (fit the function with a multidimensional paraboloid and jump to its minimum).

For the straight line, we define the model

y(xi) = a + b * xi

and minimize the sum over i = 1 to N:

S = ∑ (a + b * xi - yi)²

The vector of equations passed to Minpack is the gradient of S (the constant factor 2 is omitted, since it does not move the root):

∂S/∂a = ∑ (a + b * xi - yi)
∂S/∂b = ∑ (a + b * xi - yi) * xi

The Jacobian is then

∂²S/∂a²    ∂²S/∂a∂b
∂²S/∂b∂a   ∂²S/∂b²

or, explicitly,

N       ∑ xi
∑ xi    ∑ xi²
All the sums can be found in minfunj in straightlinej.f90.

The call of hybrj searches for a minimum of the function. On output, the located minimum is included in fvec, and I added code to compute the covariance matrix to estimate the statistical deviations of the parameters and their correlation.

With gfortran on a 64-bit machine I get the solution:

....
minfun: par =   29.22344    -1.91302  sum =  82.68976
minfun: par = 29.22344 -1.91302 sum = 82.68976
hybrj finished with code 1
Exit for:
algorithm estimates that the relative error between x and the solution is at most tol.
qr factorized jacobian in minimum:
q:

0.10769513068469699 0.99418396628934158
-0.99418396628934158 0.10769513068469688

r:
152.97596122677001 1371.8039727109779
1371.8039727109779 17.656368881644468

inverse of r (cov):

0.25799238244686612 -2.87651125847342495E-002
-2.87651125847342495E-002 3.20772561895261684E-003

covariance:

1.7777772679919355 -0.19821501231688973
-0.19821501231688973 2.21038374592466315E-002

No. of data = 14
solution = 29.223435764531654 -1.9130248056275454
with dev = 1.3333331421636287 0.14867359368511487
residual sum = 82.689756238430220
rms = 2.6250358130641160
The results must correspond (within a precision of three digits) to the reference solution. As we can see, there is a discrepancy in the deviations of the parameters: Minpack's estimation is a little bit optimistic. I think that is due to the difference between matrix inversion (which is usually used) and Minpack's covariance estimation. On the other hand, the values are the same from a practical point of view.

Just for information, the inverse of the Jacobian in the minimum (all computed with Octave) is (inv(.)):
 0.4846353  -0.0462792
-0.0462792 0.0051833
and the QR factorization ([q,r,.]=qr(.)):
q =
-0.995472 0.095060
-0.095060 -0.995472

r =
-2.0541e+00 -1.2576e+02
0.0000e+00 -1.3150e+03
The columns of the Q matrix are base vectors (eigenvectors) of the solution (the principal axes of the covariance ellipsoid) and the diagonal elements are estimates of the eigenvalues (the major and minor semi-axes of the ellipse); compare with [l,v]=eig(.):

l =
-0.995457 0.095208
0.095208 0.995457

v =
2.0447e+00 0.0000e+00
0.0000e+00 1.3210e+03

The second supplied routine straightline.f90 does the same work but without explicit knowledge of the second derivatives. The Jacobian is estimated by numerical differences.

The solution can also be obtained via the lmdif and lmder routines of Minpack. It is equivalent to the presented solution but doesn't offer a generalization towards robust methods.

Robust Fitting

The reference robust fitting procedure is included in rstraightline.f90.

The fitting is logically divided into two parts. The first part implements minimizing the sum of absolute deviations to get a robust estimate of the solution and of the MAD (median of absolute deviations). This differs from LS because the minimized function has no derivative at the minimum, so we need another method which does not use derivatives. I'm using code prepared by John Burkardt, namely the Nelder-Mead minimization algorithm (simplex method). I slightly rearranged the code into nelmin.f90.

The resulting parameters are used to obtain the MAD by looking for the median with a quick selection algorithm described in Niklaus Wirth's Algorithms + Data Structures = Programs.
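In C++ the same partial-selection trick is available as std::nth_element; the following sketch (with made-up residuals) finds the median without a full sort, in the spirit of Wirth's algorithm:

// mad.cpp - median of absolute residuals via partial selection
#include <algorithm>
#include <cstdio>
#include <vector>

double median(std::vector<double> v)            // works on a copy
{
  size_t m = v.size() / 2;
  std::nth_element(v.begin(), v.begin() + m, v.end());
  return v[m];                                  // upper median for even sizes
}

int main()
{
  // absolute residuals |a + b*x - y| of some hypothetical fit
  std::vector<double> r = {0.3, 2.1, 0.7, 1.5, 0.2, 4.8, 0.9};
  std::printf("MAD estimate = %g\n", median(r)); // prints 0.9
  return 0;
}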

The solution is then passed as the starting point for hybrd, which is the second part. The minfun is similar to the non-robust version; the only difference is that the residual (the difference between the predicted and measured value) is not used directly, but passed through a cut-off function (Tukey's function). This small change implements the robust fitting itself.
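Such a cut-off function can be sketched as follows; Tukey's biweight psi is shown with the usual textbook constant c = 4.685, which is not necessarily the constant or scaling used inside Munipack's minfun:

// tukey.cpp - Tukey's biweight cut-off applied to a scaled residual u = r/s
#include <cstdio>

double tukey_psi(double u, double c = 4.685)
{
  if (u < -c || u > c) return 0.0;             // outliers contribute nothing
  double t = 1.0 - (u / c) * (u / c);
  return u * t * t;                            // smooth cut-off inside (-c, c)
}

int main()
{
  double residuals[] = {0.5, 2.0, 4.0, 10.0};
  for (double u : residuals)
    std::printf("psi(%4.1f) = %g\n", u, tukey_psi(u));
  return 0;
}

The output of the robust run of rstraightline.f90 then looks like this: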
.....
medfun: par= 30.57930 -1.918842 sum= 29.9467137
medfun: par= 30.57930 -1.920842 sum= 29.9506577
ifault= 0 29.940157123526994
t= 30.579306941153348 -1.9198428764730142
4.3939266204833984 2.9615066
minfun: par = 30.57931 -1.91984 sum = 106.17691
.....

minfun: par = 29.24548 -1.91425 sum = 82.69177
hybrd finished with code: 1
Exit for:
algorithm estimates that the relative error between x and the solution is at most tol.
qr factorized jacobian in minimum:
q:

-0.10942034931424138 -0.99399556696996860
0.99399556696996860 -0.10942034931424133

r:

29.315789045435952 295.17538616058539
295.17538616058539 4.3667770003656630

inverse:

5.3177769129754946 -0.52802747846866527
-0.52802747846866527 5.24418460845494372E-002

covariance:

2.7756865626932967 -0.27561118127052436
-0.27561118127052436 2.73727405045027551E-002

No. of data = 14
solution = 29.245477844988198 -1.9142493591979692
with dev = 1.6660391840209812 0.16544709276533920
residual sum = 82.691772460937500
rms = 2.6250678159642766
The output values are practically the same as in the non-robust case; the only difference is in the estimation of the parameters' deviations. I'm using the formula recommended by Huber (1980), eq. (6.6), p. 173.

The real power of the robust fitting can easily be demonstrated by adding an outlier (a point with a really different value) to the set, for example a point with coordinates 10, 100. Try it to see the robust algorithm in action: its solution should stay practically the same, while the non-robust solution gives some strange values.

Minpack Fortran Interface

The original Minpack is written in Fortran 77. I'm using modern Fortran (Fortran 90, 95 or 2003), which supports better type checking via interfaces. I prepared such an interface; it is included in the archive as minpack.f90 and must be passed to the compiler during compilation. The module does not change the original API of the Minpack routines, to prevent any programming errors, so you must also pass the "working arrays" (wa) to the routines. In modern Fortran these are handled in a more elegant way as automatic arrays (arrays allocated automatically when a subroutine is entered and deallocated on its exit).

Estimation of an initial solution

The solution is independent of the starting point only in the linear case. Every complex (real) case will lead to a non-linear problem with a lot of local minima which will attract the simplex or the gradient (Powell's method) to a "wrong" solution. To locate the global minimum (i.e. the required solution), I recommend using genetic algorithms as predictors of the global minimum. The genetic algorithms will locate the right minimum with low precision, and we can use some modification of the above codes to determine the minimum with the required precision.