Ticket #9782 (inprogress)

Opened 6 years ago

Last modified 5 years ago

Add function that uses automatic differentiation to calculate derivatives

Reported by: Michael Wedel Owned by: Michael Wedel
Priority: major Milestone: Release 3.5
Component: Framework Keywords:
Cc: Blocked By:
Blocking: Tester:

Description

As described in the following document:

https://github.com/mantidproject/documents/blob/master/Design/Autodiff.md

I would like to have a function interface that uses automatic differentiation instead of numerical differentiation, utilizing "adept":

http://www.met.rdg.ac.uk/clouds/adept/
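
To illustrate the basic idea, here is a minimal sketch of how adept computes a derivative by recording operations and propagating adjoints (based on adept's documented Stack/adouble API; this is separate from the Mantid function interface proposed in the design document):

#include <adept.h>

// Record an expression on an adept::Stack and obtain dy/dx without writing
// the derivative by hand.
double derivativeOfGaussianCore(double xValue) {
  adept::Stack stack;              // records all operations on adouble values
  adept::adouble x = xValue;
  stack.new_recording();           // start recording from here
  adept::adouble y = exp(-0.5 * x * x);
  y.set_gradient(1.0);             // seed the adjoint of the output
  stack.compute_adjoint();         // propagate adjoints backwards
  return x.get_gradient();         // dy/dx = -x * exp(-0.5 * x * x)
}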

Since this new function requires adept as a third-party library (plus two or three very small source code changes to one of its files to make it compile on Windows), I would appreciate some advice on the best way to integrate this library.

I will use this branch to keep track of my local work; the code will not compile until the "library integration issue" is solved.

If you would like to try anyway, I can explain how to build and install adept on Linux (a target to build a shared object file has to be added to the makefile, since by default it only builds a static library) and/or supply the changes required to build on Windows.

Attachments

WorkspaceData.nxs (125.2 KB) - added by Michael Wedel 6 years ago.
Example data with 20 Gaussian peaks, 5001 data points.
6904_peaks_table_fortran.nxs (37.5 KB) - added by Michael Wedel 6 years ago.
6904_calc_peaks1-4_fortran.nxs (1.7 MB) - added by Michael Wedel 6 years ago.
Gaussian_y_J_053.nxs (97.8 KB) - added by Michael Wedel 6 years ago.
Lorentzian_y_J_d.nxs (72.9 KB) - added by Michael Wedel 6 years ago.
PearsonVII_y_J_d.nxs (79.0 KB) - added by Michael Wedel 6 years ago.
Gaussian_y_J_d.nxs (67.4 KB) - added by Michael Wedel 6 years ago.

Change History

comment:1 Changed 6 years ago by Michael Wedel

Re #9782. First rough draft

Changeset: f3703ba7b308265143d77318c18e2b1e25dd630c

comment:2 Changed 6 years ago by Michael Wedel

Re #9782. Refined first draft, added tests

Changeset: 9ca66ea8a89b819402a71e05869541bf87e408c9

Changed 6 years ago by Michael Wedel

Example data with 20 Gaussian peaks, 5001 data points.

comment:3 Changed 6 years ago by Michael Wedel

Re #9782. Temporarily adding adept to MantidAPI CMakeLists.txt

Changeset: 05938a8cf7d5cf934cfc08636aa8fbedea641259

comment:4 Changed 6 years ago by Michael Wedel

The testing algorithm can be used with the example data in this way (it's just a proof of concept, so it only really does this one thing):

Load(Filename='WorkspaceData.nxs', OutputWorkspace='WorkspaceData')
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace='FitParameters', OutputWorkspacePlot='Plot')

Note that I will remove the testing algorithm and function before the ticket is finished.

comment:5 Changed 6 years ago by Michael Wedel

Re #9782. Cleaning up and documentation in IFunction1DAutoDiff

Added doxygen comments and some general documentation. Cleaned up the interface of AutoDiffJacobianAdapter and added error handling.

Changeset: a94a28bff823fa5258bc30d282ef7184cb1cb254

comment:6 Changed 6 years ago by Michael Wedel

Re #9782. Fixing incorrect operator in AutoDiffJacobianAdapter

Changeset: 9b6668c0f31f4d5b939f84b61980d12f507b6aa1

comment:7 Changed 6 years ago by Michael Wedel

From my point of view this ticket is more or less done except for the unsolved question of how to integrate adept itself into Mantid.

Last edited 6 years ago by Michael Wedel

comment:8 Changed 6 years ago by Michael Wedel

To sum up my packaging and cross-platform adventures with adept:

Linux

  • Modified the makefiles of the original source archive to build an .so file instead of an .a file, so it can be linked dynamically. I thought this dynamic linking approach would be reasonable for Linux systems.
  • Created .deb and .rpm packages. I don't do this very often, so I'm not sure whether the packages are okay the way I did them. One thing that could be improved is that the .rpm package could be split into adept and adept-dev, just like the Debian package. On the other hand, there are actually only two files that need to be distributed (libadept.so and adept.h).
  • Wrote a small test program, compiled it with -ladept after installing the created package on a clean system, and made sure it produces the correct output.

Windows

  • Found out that dynamic linking is not possible here, because __declspec(thread) and __declspec(dllexport) are not compatible, so on Windows there is (probably?) no alternative to static linking.
  • Patched adept.h to use MSVC-compatible keywords (__declspec(thread) instead of __thread, __restrict instead of __restrict__); see the sketch after this list.
  • Created a VS solution that builds adept as a static library (adept.lib).
  • Created a VS solution that builds the above-mentioned example program, linking adept.lib statically, and checked the output.
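
A sketch of the kind of patch this involves (the macro names below are made up for illustration; the fork may simply substitute the keywords directly):

// Hypothetical portability macros for adept.h: MSVC spells the GCC keywords
// __thread and __restrict__ as __declspec(thread) and __restrict.
#ifdef _MSC_VER
#define ADEPT_THREAD_LOCAL __declspec(thread)
#define ADEPT_RESTRICT __restrict
#else
#define ADEPT_THREAD_LOCAL __thread
#define ADEPT_RESTRICT __restrict__
#endif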

Mac OS X

  • Could not test on Mac OS X, since I have no machine with the proper environment available.

comment:9 Changed 6 years ago by Nick Draper

  • Status changed from new to assigned

comment:10 Changed 6 years ago by Michael Wedel

Another update regarding the build system.

Windows

  • Managed to build a DLL, sacrificing the possibility of a thread-safe build. This is not a problem if either OpenMP is disabled or the body of IFunction1DAutoDiff::functionWrapper is put into a PARALLEL_CRITICAL block (tested with a small example program); see the sketch below.
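
Roughly, the workaround looks like this (illustrative only; raw OpenMP pragmas are shown, which is essentially what Mantid's PARALLEL_CRITICAL macro expands to, and the function body is a placeholder):

#include <adept.h>
#include <vector>

// Without thread-local storage in the DLL build, adept's recording stack must
// not be used from several OpenMP threads at once, so the evaluation is
// serialised inside a named critical section.
std::vector<double> evaluateWithAutoDiff(const std::vector<double> &x) {
  std::vector<double> y(x.size());
#pragma omp parallel for
  for (int i = 0; i < static_cast<int>(x.size()); ++i) {
#pragma omp critical(adept_stack)
    {
      adept::Stack stack;
      adept::adouble xi = x[i];
      stack.new_recording();
      adept::adouble yi = xi * xi;  // placeholder for the real function body
      y[i] = yi.value();            // extract the plain double value
    }
  }
  return y;
}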

General

  • For easier cross-platform builds I decided to create a CMake-based build system for the library and the included tests (only those that don't require additional libraries; the benchmarks, for example, are excluded).
  • The CMake system works on Linux and on Windows with Visual Studio.
  • I created a Findadept.cmake file for easy inclusion into the Mantid build system and tested that it actually finds the library (at least on Linux, it does).
  • Since these changes added up, I decided for now to maintain this as a fork of the library. The library itself is only two files, and by putting the fork into a GitHub repository, the original author may benefit from the changes (I have already contacted him about this).

What's left to do

  • Verify that it builds on Mac OS X. I tried building with old gcc versions against old standard libraries and this worked fine, so I expect it to be okay on Mac OS X as well.
  • Put readme-files, docs and other additional stuff into the fork repo.
  • Adjust packaging scripts (should be much easier now, as there are no patches required in the packaging process itself).
Last edited 6 years ago by Michael Wedel

comment:11 Changed 6 years ago by Michael Wedel

  • Status changed from assigned to inprogress

Refs #9782. Added CMake find module for adept

Changeset: 244715a93a6e65f02e8d425dcd3831507fec34b0

comment:12 Changed 6 years ago by Michael Wedel

Refs #9782. Added PARALLEL_CRITICAL section.

Changeset: 62cc1e6d344edb95acdd4c49b558f713b4962062

comment:13 Changed 6 years ago by Michael Wedel

The packaging scripts are adjusted as well. I built packages for Ubuntu 13.10 and Fedora 13 (I don't have RHEL6 available, so I chose an old Fedora version; if it builds there, it should work on newer versions as well). The CMake build system works on Windows too.

Only Mac OS X left now.

comment:14 Changed 6 years ago by Michael Wedel

  • Blocking 9991 added

comment:15 Changed 6 years ago by Michael Wedel

  • Blocking 9992 added

comment:16 Changed 6 years ago by Michael Wedel

Refs #9782. Added Gaussian function with numerical derivatives

The same as GaussianAutoDiff, for comparison.

Changeset: 1fbcda1946c6817c5c7071eb2e907bf54c110c72

comment:17 Changed 6 years ago by Michael Wedel

Refs #9782. Comparison capability on AutoDiffTestAlg

Roman's changes to make a comparison possible between numerical and auto diff.

Changeset: 80544917d0c9a0233a9443804e1a7a3c15e44c56

comment:18 Changed 6 years ago by Michael Wedel

Refs #9782. Modifying fit to output number of iterations.

Changeset: d15ce0559fc1de45452559ad45823640b0d39ade

Changed 6 years ago by Michael Wedel

Changed 6 years ago by Michael Wedel

comment:19 Changed 6 years ago by Michael Wedel

Refs #9782. Added gaussian with hand-coded derivatives

Changeset: 1538ab6ea25a15dc266462d999451d094447ebae

Changed 6 years ago by Michael Wedel

comment:20 Changed 6 years ago by Michael Wedel

Refs #9782. Removed hard-coded number from AutoDiffTestAlg

Changeset: 0acd29456afde9e9439171692ec67e3c1f02e590

comment:21 Changed 6 years ago by Michael Wedel

Update

Performance tests conducted by Roman and me showed that, in contrast to my expectation, automatic differentiation is slightly slower for the cases we tested, even with a fair number of parameters and spectra to fit. I also tested different approaches to integrating adept into the function fitting framework, but none of them led to significant speedups. Compiling adept with a larger initial stack size helped a bit, since the stack then does not need to grow as often (growing involves re-allocating memory and is rather expensive).

Another argument generally made in favor of automatic differentiation is the better precision of the resulting derivatives, so I wanted to test this as well.

I modified AutoDiffTestAlg to also include a function with hand-coded derivatives and to make it more suitable for systematic testing.

The algorithm expects a MatrixWorkspace with 4 spectra, which contain the following data:

  1. f(x)
  2. df/d(centre)
  3. df/d(height)
  4. df/d(sigma)

It produces an equivalent workspace with the differences (reference - calculated) from calls to IFunction::function and IFunction::functionDeriv, using the specified method (num, adept, handcoded).

In attachment:Gaussian_y_J_053.nxs there are data for a Gaussian (Centre: 0.0, Height: 4.0, Sigma: 3.5) on the range [-5, 5], produced with the MATLAB Symbolic Math Toolbox. Assuming the file is located in the data search path, the following script generates workspaces for all three methods:

Load(Filename='Gaussian_y_J_053.nxs', OutputWorkspace='WorkspaceData', LoaderName='LoadNexusProcessed', LoaderVersion=1)
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="HandCoded",DerivativeType="handcoded")
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="Numerical",DerivativeType="num")
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="AutoDiff",DerivativeType="adept")

For hand-coded and automatic derivatives the differences are all in the range [-5e-16, 5e-16], but they are not identical. I suspect differences at that level are not significant given floating point precision (the values are on the order of 1e0 or 1e-1; with 15-17 significant digits available, I would classify these differences as "tolerable").

In the numerical case all three derivatives behave differently. For df/d(centre) the differences from the correct values are largest, lying on the interval [-0.03, 0.03], where the derivatives themselves are on a scale of roughly [-0.8, 0.8]. df/d(height) looks reasonable, with differences in [-5e-14, 5e-14] and a distribution that seems more or less random. The deviations of df/d(sigma) are on the interval [0, 7e-4], with derivative values in [0, 0.9]. Here the differences look very systematic: the difference curve looks very similar to the derivative itself, just three orders of magnitude smaller.

With the script above and the attached data these results should be reproducible.

Overall, the hand-coded and automatic derivatives are comparable and do not show significant deviations from the reference results. Numerical derivatives show differences which can become large in some cases. I know this is expected because of truncation and rounding of floating point numbers, and that it depends on many factors such as the parameter value range, the function value range, and possibly other factors that I am forgetting now. I tried to choose "reasonable" example parameter ranges, but I am willing to try more functions/parameter combinations as well.
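
The behaviour of df/d(centre) can be reproduced in isolation with a simple finite difference (a self-contained sketch, not Mantid code; the step width is an arbitrary choice, so the error sizes will not match the workspace differences above, but the truncation/rounding trade-off is the same):

#include <cmath>
#include <cstdio>

// Gaussian with centre c, height h, width s, evaluated at x.
double gaussian(double x, double c, double h, double s) {
  const double t = (x - c) / s;
  return h * std::exp(-0.5 * t * t);
}

int main() {
  const double c = 0.0, h = 4.0, s = 3.5;
  const double eps = 1e-7;  // finite-difference step (arbitrary choice)
  for (double x = -5.0; x <= 5.0; x += 2.5) {
    // Analytic df/d(centre) and a one-sided finite difference approximation.
    const double analytic = gaussian(x, c, h, s) * (x - c) / (s * s);
    const double numeric = (gaussian(x, c + eps, h, s) - gaussian(x, c, h, s)) / eps;
    std::printf("x=%5.1f  analytic=% .6e  error=% .3e\n", x, analytic, numeric - analytic);
  }
  return 0;
}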

comment:22 Changed 6 years ago by Michael Wedel

Refs #9782. Changed AutoDiffTestAlg

The algorithm now takes function parameters (for calculating values) and measures performance of derivative calculation.

Changeset: e1440dbf36023a7af3f042428a05d7ed78b03be8

comment:23 Changed 6 years ago by Michael Wedel

Refs #9782. Changed implementation of Gaussians to not use pow

I was not aware how incredibly slow pow() is for squaring a value.

Changeset: 5e7de3ecd7ac1a7009ed0c9e9abb14c6675d1703

comment:24 Changed 6 years ago by Michael Wedel

I decided to spend some more time on profiling, because it seemed odd that hand-coded derivatives would sometimes be slower than numerical ones, and discovered that using pow for squaring is not a good idea, especially with automatic differentiation. This is why the numerical derivatives were getting away quite cheaply: they only called pow once, and only the double version, as opposed to the adept version. Now the relative performance of calculating derivatives directly through IFunction::functionDeriv() is ca. 1 : 2.5 : 2.8 (hand-coded, adept, numerical).
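
The change amounts to something like the following (an illustrative before/after, not the actual commit; adept overloads exp and pow for its adouble type, and the function names here are made up):

#include <adept.h>

// Before: pow() records a general power operation, which is expensive both
// for the value and for the derivative adept has to track.
adept::adouble gaussianWithPow(adept::adouble x, adept::adouble s) {
  return exp(-0.5 * pow(x / s, 2.0));
}

// After: squaring by multiplication, which is far cheaper to evaluate and
// to differentiate.
adept::adouble gaussianWithSquare(adept::adouble x, adept::adouble s) {
  adept::adouble t = x / s;
  return exp(-0.5 * t * t);
}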

Changed 6 years ago by Michael Wedel

comment:25 Changed 6 years ago by Michael Wedel

Refs #9782. Added Lorentz-profiles

They all live in a namespace called "Lorentzians".

Changeset: a20c5fc6e651e9acb5920eea67415927f5f2fb79

comment:26 Changed 6 years ago by Michael Wedel

Refs #9782. AutoDiffTestAlg uses Lorentzians instead of Gaussians

Changeset: 8b0f0e58850ecce1e8972f73dafca4b8bb1b6703

comment:27 Changed 6 years ago by Michael Wedel

The Lorentzian is an interesting test case, because there isn't even a call to exp; it's just basic arithmetic operations. In this case the relative performance is 1 : 1.94 : 6.97, so roughly 1 : 2 : 7 (hand-coded, numerical, adept). It's not what I expected, though (what I had written previously was caused by a wrong ordering of my output), so I will investigate further.

Last edited 6 years ago by Michael Wedel

comment:28 Changed 6 years ago by Michael Wedel

Refs #9782. AutoDiffTestAlg improved

Changeset: f6207463aec71b57903d12ee3a212c65c884d1d3

comment:29 Changed 6 years ago by Michael Wedel

Refs #9782. Adding necessary compile time defines

Changeset: bb5616b1926bcae24bfe84342e958b90eb6ab700

comment:30 Changed 6 years ago by Michael Wedel

Refs #9782. Const string reference

Changeset: d96c28279914ec75c7aac93500149f96dfcc4133

comment:31 Changed 6 years ago by Michael Wedel

Refs #9782. Added Pearson-VII, changed AutoDiffTestAlg

Changeset: 3b6f16be424f2c25563f8e69d4e1a930c37a8174

Changed 6 years ago by Michael Wedel

comment:32 Changed 6 years ago by Michael Wedel

In the last commit I added the Pearson-VII function (as defined at http://pd.chem.ucl.ac.uk/pdnn/peaks/pvii.htm), which has 4 parameters, definitely requires the use of pow, and has fairly complex derivatives.
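
For reference, one common Pearson-VII parameterisation looks roughly like this (a hedged sketch; the parameter names, their ordering, and the exact form implemented in the ticket may differ):

#include <adept.h>

// Pearson-VII peak: height h, centre x0, half-width w, shape parameter m.
// Both pow() calls are unavoidable here, unlike the simple squaring in the
// Gaussian and Lorentzian cases.
adept::adouble pearsonVII(double x, adept::adouble x0, adept::adouble h,
                          adept::adouble w, adept::adouble m) {
  adept::adouble t = (x - x0) / w;
  adept::adouble k = pow(adept::adouble(2.0), 1.0 / m) - 1.0;
  return h * pow(1.0 + k * t * t, -m);
}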

With the attachment data, you can do:

Load(Filename='PearsonVII_y_J_053.nxs', OutputWorkspace='WorkspaceData', LoaderName='LoadNexusProcessed', LoaderVersion=1)
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="HandCoded",DerivativeType="handcoded", GaussianParameters="0.665 200.0 0.0007 2.0")
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="Numerical",DerivativeType="num", GaussianParameters="0.665 200.0 0.0007 2.0")
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="AutoDiff",DerivativeType="adept", GaussianParameters="0.665 200.0 0.0007 2.0")

The derivatives are not as precise as the ones for the Gaussian and Lorentzian, not even the hand-coded ones. I guess this has something to do with the implementation of the mathematical functions that are used and, in the end, with floating point accuracy.

This is a summary of the performance (on my Ubuntu machine). It's always Hand-coded : Numerical : Adept.

Gaussian

  • Function: 1 : 1 : 1.3446
  • Derivatives: 1 : 2.8005 : 1.9767
  • Fit (iterations): 6 : 4 : 6

Lorentzian

  • Function: 1 : 1 : 3.3559
  • Derivatives: 1 : 1.9318 : 5.2510
  • Fit (iterations): 7 : 6 : 7

Pearson-VII

  • Function: 1 : 1 : 1.0908
  • Derivatives: 1 : 2.2272 : 1.0737
  • Fit (iterations): 6 : 7 : 6

Based on these findings I think it's really worthwhile to include this feature in Mantid.

Changed 6 years ago by Michael Wedel

comment:33 Changed 6 years ago by Michael Wedel

Local checkpoint Refs #9782

Changeset: ad174be582f79906256262969171d0e2e56dc67b

comment:34 Changed 6 years ago by Michael Wedel

Refs #9782. Some cleanup and comments in AutoDiffTestAlg

Changeset: 7f7318465d5a5097e684865e56b580e510f459b3

comment:35 Changed 6 years ago by Michael Wedel

The main adept integration is in Mantid::API::IFunctionAutoDiff. Because of the way parameters work with automatic differentiation, they cannot be used in the usual way. There are examples of how to implement concrete functions in Mantid::CurveFitting (GaussianAutoDiff, Lorentzians, Pearsons).
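
At its core, what such a function does per domain is along these lines (a plain-adept sketch assuming adept's independent/dependent/jacobian API; the actual IFunctionAutoDiff interface wraps this with parameter and Jacobian adapters, so names and signatures here are illustrative):

#include <adept.h>
#include <vector>

// Compute the Jacobian of a Gaussian w.r.t. (centre, height, sigma) with
// adept instead of hand-coding the partial derivatives.
void gaussianJacobian(const std::vector<double> &x,
                      double centre, double height, double sigma,
                      std::vector<double> &jacobian) {
  adept::Stack stack;
  adept::adouble params[3] = {centre, height, sigma};
  stack.new_recording();

  std::vector<adept::adouble> y(x.size());
  for (size_t i = 0; i < x.size(); ++i) {
    adept::adouble t = (x[i] - params[0]) / params[2];
    y[i] = params[1] * exp(-0.5 * t * t);
  }

  stack.independent(params, 3);        // derivatives are taken w.r.t. these
  stack.dependent(&y[0], y.size());    // ... for these outputs
  jacobian.resize(3 * x.size());
  stack.jacobian(&jacobian[0]);        // dense Jacobian (check the adept docs
                                       // for the storage order)
}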

In Mantid::CurveFitting there is a test algorithm, AutoDiffTestAlg, which can be invoked by this script:

Load(Filename='/home/mwedel/Data/TestData/AutoDiff/Gaussian_y_J_d.nxs', OutputWorkspace='WorkspaceData', LoaderName='LoadNexusProcessed', LoaderVersion=1)
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="G_HandCoded", FunctionName="Gaussian",DerivativeType="analytical", FunctionParameters="0.665 200.0 0.0007")
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="G_Numerical", FunctionName="Gaussian",DerivativeType="num", FunctionParameters="0.665 200.0 0.0007")
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="G_AutoDiff", FunctionName="Gaussian",DerivativeType="adept", FunctionParameters="0.665 200.0 0.0007")

Load(Filename='/home/mwedel/Data/TestData/AutoDiff/Lorentzian_y_J_d.nxs', OutputWorkspace='WorkspaceData', LoaderName='LoadNexusProcessed', LoaderVersion=1)
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="L_HandCoded", FunctionName="Lorentzian",DerivativeType="analytical", FunctionParameters="0.665 200.0 0.0007")
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="L_Numerical", FunctionName="Lorentzian",DerivativeType="num", FunctionParameters="0.665 200.0 0.0007")
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="L_AutoDiff", FunctionName="Lorentzian",DerivativeType="adept", FunctionParameters="0.665 200.0 0.0007")

Load(Filename='/home/mwedel/Data/TestData/AutoDiff/PearsonVII_y_J_d.nxs', OutputWorkspace='WorkspaceData', LoaderName='LoadNexusProcessed', LoaderVersion=1)
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="P_HandCoded", FunctionName="Pearson",DerivativeType="analytical", FunctionParameters="0.665 200.0 0.0007 2.0")
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="P_Numerical", FunctionName="Pearson",DerivativeType="num", FunctionParameters="0.665 200.0 0.0007 2.0")
AutoDiffTestAlg(InputWorkspace='WorkspaceData', OutputWorkspace="P_AutoDiff", FunctionName="Pearson",DerivativeType="adept", FunctionParameters="0.665 200.0 0.0007 2.0")

It records timing information and generates output with accuracy information for the partial derivatives. To use it you need to download the attached data files.

Last edited 6 years ago by Michael Wedel

comment:36 Changed 6 years ago by Michael Wedel

  • Milestone changed from Release 3.3 to Release 3.4

Unfortunately I have to postpone this one to Release 3.4.

comment:37 Changed 6 years ago by Michael Wedel

  • Blocking 9991 removed

comment:38 Changed 6 years ago by Michael Wedel

  • Milestone changed from Release 3.4 to Release 3.5

comment:39 Changed 6 years ago by Michael Wedel

  • Blocking 9992 removed

comment:40 Changed 5 years ago by Stuart Campbell

This ticket has been transferred to GitHub issue 10624.
