diff --git a/CHANGELOG.md b/CHANGELOG.md
index a945de7..b7e1cb3 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,329 +1,333 @@
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html) for final versions and [PEP440](https://www.python.org/dev/peps/pep-0440/) in case intermediate versions need to be released (e.g. development version `2.2.3.dev1` or release candidates `2.2.3rc1`), or individual commits are packaged.
## Unreleased
### Added
- Added `tamaas.utils.publications` to print a list of publications relevant to
the parts of Tamaas used in a script. Call at the end of a script for an
exhaustive list.
- Added `tamaas.mpi.gather` and `tamaas.mpi.scatter` to help handling of 2D data
in MPI contexts.
- Added `getBoundaryFields()/boundary_fields` function/property to `Model`.
- Added Python constructor to `Model`, which is an alias to
`ModelFactory.createModel`.
- Added convenience `tamaas.dumpers._helper.netCDFtoParaview` function to
convert model sequence to Paraview data file.
- Added dump of surface fields as `FieldData` in VTK files
- Added `scipy.sparse.linalg.LinearOperator` interface to `Westergaard`. A
`LinearOperator` can be created by calling
`scipy.sparse.linalg.aslinearoperator` on the operator object.
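  As a generic illustration of that interface (using a stand-in class, not
  Tamaas' actual `Westergaard` operator), any object deriving from
  `LinearOperator` with a `_matvec` method can be wrapped and then passed to
  SciPy's iterative solvers:

  ```python
  import numpy as np
  from scipy.sparse.linalg import LinearOperator, aslinearoperator

  class StandIn(LinearOperator):
      """Placeholder for an operator exposing the LinearOperator interface."""

      def __init__(self, n):
          super().__init__(dtype=np.float64, shape=(n, n))

      def _matvec(self, x):
          return 2 * x  # placeholder action of the operator

  # aslinearoperator returns the object itself if it is already a LinearOperator
  op = aslinearoperator(StandIn(4))
  y = op.matvec(np.ones(4))  # array of 2s
  ```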
- Allowed sub-classing of `ContactSolver` in Python.
- Exposed elastic contact functionals in Python to help with custom solver
implementation.
- Added an example of custom solver with penalty method using Scipy.
- Added `graphArea` function in statistics to compute the area of a function
  graph
- Added `tamaas.utils.radial_average` to radially average a 2D field
+- Added option to not create a dump directory (e.g. the `netcdf`, `hdf5`
+  directories for dump files). This allows specifying an absolute path for
+  dump files, which is useful if the simulation workflow is managed with file
+  targets, as in systems like GNU Make and Snakemake.
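+  A minimal sketch of the behavior this enables (class and attribute names
+  below are hypothetical, not the actual dumper API):
+
+  ```python
+  from pathlib import Path
+
+  class SketchDumper:
+      """Hypothetical dumper: with mkdir=True the dump directory is
+      prepended (and created on dump); with mkdir=False the file name is
+      used verbatim, so absolute paths become valid file targets."""
+
+      directory = Path("netcdf")
+
+      def __init__(self, basename, mkdir=True):
+          self.basename = basename
+          self.mkdir = mkdir
+
+      @property
+      def file_path(self):
+          if self.mkdir:
+              return str(self.directory / self.basename)
+          return self.basename
+
+  dump = SketchDumper("/abs/path/out.nc", mkdir=False)
+  ```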
### Changed
- Smarter filtering of boundary fields in dumpers
- `read` functions in dumpers are now `@classmethod`
- Changed sign of plastic correction in KatoSaturated solver to match the
EPICSolver sign.
- Plastic correction of KatoSaturated is registered in the model as
`KatoSaturated::residual_displacement`.
- Changed to modern Python packaging (PEP517) with `setup.cfg` and
`pyproject.toml`. A `setup.py` script is still provided for editable
installations, see [gh#7953](https://github.com/pypa/pip/issues/7953). Once
setuptools has stable support for
[PEP621](https://peps.python.org/pep-0621/), `setup.cfg` will be merged into
`pyproject.toml`. Note that source distributions that can be built with
[`build`](https://pypi.org/project/build/) are not true source distributions,
since they contain the compiled Python module of Tamaas.
- Updated copyright to reflect Tamaas' development history since leaving EPFL
### Fixed
- Fixed `tamaas.dumpers.NetCDFDumper` to dump in MPI context.
### Deprecated
- SCons versions < 3 and Python < 3.6 are explicitly no longer supported.
## v2.4.0 -- 2022-03-22
### Added
- Added a `tamaas.utils` module.
- Added `tamaas.utils.load_path` generator, which yields model objects for a
sequence of applied loads.
- Added `tamaas.utils.seeded_surfaces` generator, which yields surfaces for a
sequence of random seeds.
- Added `tamaas.utils.hertz_surface` which generates a parabolic (Hertzian)
surface.
- Added `tamaas.utils.log_context` context manager which locally sets the log
level for Tamaas' logger.
- Added `tamaas.compute.from_voigt` to convert Voigt/Mendel fields to dense
tensor fields.
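  For a single symmetric 3×3 tensor, the conventional Voigt index mapping can
  be sketched as follows (`tamaas.compute.from_voigt` itself operates on whole
  fields, and its exact ordering/scaling conventions may differ):

  ```python
  import numpy as np

  # Conventional Voigt ordering for a symmetric 3x3 tensor:
  # [t00, t11, t22, t12, t02, t01]
  VOIGT = [(0, 0), (1, 1), (2, 2), (1, 2), (0, 2), (0, 1)]

  def voigt_to_dense(v):
      """Rebuild a dense symmetric tensor from its Voigt vector."""
      t = np.zeros((3, 3))
      for k, (i, j) in enumerate(VOIGT):
          t[i, j] = t[j, i] = v[k]
      return t

  t = voigt_to_dense(np.arange(1.0, 7.0))
  ```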
- Added a deep-copy function to `ModelFactory` to copy `Model` objects. Use
`model_copy = copy.deepcopy(model)` in Python to make a copy. Currently only
copies registered fields, same as dumpers/readers.
- Added a `read_sequence` method for dumpers to read model frames.
- Automatic draft release on Zenodo.
### Changed
- `*_ROOT` variables for vendored dependencies GoogleTest and pybind11 now
default to empty strings, so that the vendored trees in `third-party/` are not
selected by default. This is so system packages are given priority. Vendored
dependency submodules will eventually be deprecated.
### Fixed
- Fixed an issue with `scons -c` when GTest was missing.
## v2.3.1 -- 2021-11-08
### Added
- Now using `clang-format`, `clang-tidy` and `flake8` for linting
- Pre-built Tamaas images can be pulled with `docker pull
registry.gitlab.com/tamaas/tamaas`
- Added a `--version` flag to the `tamaas` command
### Changed
- The root `Dockerfile` now compiles Tamaas, so using Tamaas in Docker is easier.
- The model attributes dumped to Numpy files are now written in a JSON-formatted
string to avoid unsafe loading/unpickling of objects.
- Removed the `build_doc` build option: now the doc targets are automatically
added if the dependencies are met, and built if `scons doc` is called.
- Removed the `use_googletest` build option: if tests are built and gtest is
present, the corresponding tests will be built
### Fixed
- The command `tamaas plot` gracefully fails if Matplotlib is missing
- Better heuristic to guess which fields are defined on boundary in dumpers
(still not perfect and may give false negatives)
## v2.3.0 -- 2021-06-15
### Added
- Added `read()` method to dumpers to create a model from a dump file
- `getClusters()` can be called in an MPI context with partial contact maps
- Added a JSON encoder class for models and a JSON dumper
- CUDA compatibility is re-established, but has not been tested
- Docstrings in the Python bindings for many classes/methods
### Changed
- Tamaas version numbers are now managed by
[versioneer](https://github.com/python-versioneer/python-versioneer). This
means that Git tags prefixed with `v` (e.g. `v2.2.3`) carry meaning and
determine the version. When no tag is set, versioneer uses the last tag,
specifies the commit short hash and the distance to the last tag (e.g.
`2.2.2+33.ge314b0e`). This version string is used in the compiled library, the
`setup.py` script and the `__version__` variable in the python module.
- Tamaas migrated to [GitLab](https://gitlab.com/tamaas/tamaas)
- Continuous delivery has been implemented:
- the `master` branch will now automatically build and publish Python wheels
to `https://gitlab.com/api/v4/projects/19913787/packages/pypi/simple`. These
"nightly" builds can be installed with:
      pip install \
        --extra-index-url https://gitlab.com/api/v4/projects/19913787/packages/pypi/simple \
        tamaas
- version tags pushed to `master` will automatically publish the wheels to
[PyPI](https://pypi.org/project/tamaas/)
### Deprecated
- The `finalize()` function is now deprecated, since it is automatically called
when the process terminates
- Python versions 3.5 and below are not supported anymore
### Fixed
- Fixed a host of dump read/write issues when model type was not `volume_*d`.
Dumper tests are now streamlined and systematic.
- Fixed a bug where `Model::solveDirichlet` would not compute correctly
- Fixed a bug where `Statistics::contact` would not normalize by the global
number of surface points
## v2.2.2 -- 2021-04-02
### Added
- Entry-point `tamaas` defines a grouped CLI for `examples/pipe_tools`. Try
executing `tamaas surface -h` from the command-line!
### Changed
- `CXXFLAGS` are now passed to the linker
- Added this changelog
- Using absolute paths for environmental variables when running `scons test`
- Reorganized documentation layout
- Gave the build system a facelift (docs are now generated directly with SCons
instead of a Makefile)
### Deprecated
- Python 2 support is discontinued. Version `v2.2.1` is the last PyPI build with
  a Python 2 wheel.
- The scripts in `examples/pipe_tools` have been replaced by the `tamaas` command
### Fixed
- `UVWDumper` no longer imports `mpi4py` in sequential runs
- Compiling with different Thrust/FFTW backends
## v2.2.1 -- 2021-03-02
### Added
- Output registered fields and dumpers in `print(model)`
- Added `operator[]` to the C++ model class (for fields)
- Added `traction` and `displacement` properties to Python model bindings
- Added `operators` property to Python model bindings, which provides a
dict-like access to registered operators
- Added `shape` and `spectrum` properties to Python surface generator
  bindings
- Surface generator constructor accepts surface global shape as argument
- Choice of FFTW thread model
### Changed
- Tests use `/tmp` for temporary files
- Updated dependency versions (Thrust, Pybind11)
### Deprecated
- Most `get___()` and `set___()` in Python bindings have been deprecated. They
will generate a `DeprecationWarning`.
### Removed
- All legacy code
## v2.2.0 -- 2020-12-31
### Added
- More accurate function for computation of contact area
- Function to compute deviatoric of tensor fields
- MPI implementation
- Convenience `hdf5toVTK` function
- Readonly properties `shape`, `global_shape`, `boundary_shape` on model to give
shape information
### Changed
- Preprocessor defined macros are prefixed with `TAMAAS_`
- Moved `tamaas.to_voigt` to `tamaas.compute.to_voigt`
### Fixed
- Warning about deprecated constructors with recent GCC versions
- Wrong computation of grid strides
- Wrong computation of grid sizes in views
## v2.1.4 -- 2020-08-07
### Added
- Possibility to generate a static `libTamaas`
- C++ implementation of DFSANE solver
- Allowing compilation without OpenMP
### Changed
- NetCDF dumper writes frames to a single file
### Fixed
- Compatibility with SCons+Python 3
## v2.1.3 -- 2020-07-27
### Added
- Version number to `TamaasInfo`
### Changed
- Prepending root directory when generating archive
## v2.1.2 -- 2020-07-24
This release changes some core internals related to discrete Fourier transforms
for future MPI support.
### Added
- Caching `CXXFLAGS` in SCons build
- SCons shortcut to create code archive
- Test of the elastic-plastic contact solver
- Paraview data dumper (`.pvd` files)
- Compression for UVW dumper
- `__contains__` and `__iter__` Python bindings of model
- Warning message of possible overflow in Kelvin
### Changed
- Simplified `tamaas_info.cpp`, particularly the diff part
- Using a new class `FFTEngine` to manage discrete Fourier transforms. Plans are
re-used as much as possible with different data with the same shape. This is
in view of future MPI developments
- Redirecting I/O streams in solve functions so they can be used from Python
(e.g. in Jupyter notebooks)
- Calling `initialize()` and `finalize()` is no longer necessary
### Fixed
- Convergence issue with non-linear solvers
- Memory error in volume potentials
## v2.1.1 -- 2020-04-22
### Added
- SCons shortcut to run tests
### Fixed
- Correct `RPATH` for shared libraries
- Issues with SCons commands introduced in v2.1.0
- Tests with Python 2.7
## v2.1.0 -- 2020-04-17
### Added
- SCons shortcuts to build/install Tamaas and its components
- Selection of integration method for Kelvin operator
- Compilation option to remove the legacy part of Tamaas
- NetCDF dumper
### Fixed
- Link bug with clang
- NaNs in Kato saturated solver
## v2.0.0 -- 2019-11-11
First public release. Contains relatively mature elastic-plastic contact code.
diff --git a/python/tamaas/dumpers/_helper.py b/python/tamaas/dumpers/_helper.py
index 8a48ed4..06bb42e 100644
--- a/python/tamaas/dumpers/_helper.py
+++ b/python/tamaas/dumpers/_helper.py
@@ -1,159 +1,168 @@
# -*- mode:python; coding: utf-8 -*-
#
# Copyright (©) 2016-2022 EPFL (École Polytechnique Fédérale de Lausanne),
# Laboratory (LSMS - Laboratoire de Simulation en Mécanique des Solides)
# Copyright (©) 2020-2022 Lucas Frérot
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
"""
Helper functions for dumpers
"""
from os import PathLike
from functools import wraps
from pathlib import Path
import io
import numpy as np
from .. import model_type, type_traits, mpi
__all__ = ["step_dump", "directory_dump"]
_basic_types = [t for t, trait in type_traits.items() if trait.components == 1]
def _is_surface_field(field, model):
def _to_global(shape):
if len(shape) == len(model.boundary_shape) + 1:
return mpi.global_shape(model.boundary_shape) + [shape[-1]]
else:
return mpi.global_shape(shape)
b_shapes = [list(model[name].shape) for name in model.boundary_fields]
shape = list(field.shape)
return any(shape == s for s in b_shapes) \
or any(shape == _to_global(s) for s in b_shapes)
def local_slice(field, model):
n = model.shape
bn = model.boundary_shape
gshape = mpi.global_shape(bn)
offsets = np.zeros_like(gshape)
offsets[0] = mpi.local_offset(gshape)
if not _is_surface_field(field, model) and len(n) > len(bn):
gshape = [n[0]] + gshape
offsets = np.concatenate(([0], offsets))
shape = bn if _is_surface_field(field, model) else n
if len(field.shape) > len(shape):
shape += field.shape[len(shape):]
def sgen(offset, size):
return slice(offset, offset + size, None)
def sgen_basic(offset, size):
return slice(offset, offset + size)
slice_gen = sgen_basic if model_type in _basic_types else sgen
return tuple(map(slice_gen, offsets, shape))
def step_dump(cls):
"""
Decorator for dumper with counter for steps
"""
orig_init = cls.__init__
orig_dump = cls.dump
@wraps(cls.__init__)
def __init__(obj, *args, **kwargs):
orig_init(obj, *args, **kwargs)
obj.count = 0
def postfix(obj):
return "_{:04d}".format(obj.count)
@wraps(cls.dump)
def dump(obj, *args, **kwargs):
orig_dump(obj, *args, **kwargs)
obj.count += 1
cls.__init__ = __init__
cls.dump = dump
cls.postfix = property(postfix)
return cls
def directory_dump(directory=""):
"Decorator for dumper in a directory"
directory = Path(directory)
def actual_decorator(cls):
orig_dump = cls.dump
orig_filepath = cls.file_path.fget
+ orig_init = cls.__init__
+
+ @wraps(cls.__init__)
+    def init(obj, *args, **kwargs):
+        # Pop the new 'mkdir' keyword so it is not forwarded to a
+        # wrapped __init__ that may not accept it
+        obj.mkdir = kwargs.pop('mkdir', True)
+        orig_init(obj, *args, **kwargs)
@wraps(cls.dump)
def dump(obj, *args, **kwargs):
- if mpi.rank() == 0:
+        if mpi.rank() == 0 and obj.mkdir:
directory.mkdir(parents=True, exist_ok=True)
orig_dump(obj, *args, **kwargs)
@wraps(cls.file_path.fget)
def file_path(obj):
- return str(directory / orig_filepath(obj))
+        if obj.mkdir:
+            return str(directory / orig_filepath(obj))
+        return orig_filepath(obj)
+ cls.__init__ = init
cls.dump = dump
cls.file_path = property(file_path)
return cls
return actual_decorator
def hdf5toVTK(inpath, outname):
"""Convert HDF5 dump of a model to VTK."""
from . import UVWDumper, H5Dumper # noqa
UVWDumper(outname, all_fields=True) << H5Dumper.read(inpath)
def netCDFtoParaview(inpath, outname):
"""Convert NetCDF dump of model sequence to Paraview."""
from . import UVWGroupDumper, NetCDFDumper # noqa
dumper = UVWGroupDumper(outname, all_fields=True)
for model in NetCDFDumper.read_sequence(inpath):
dumper << model
def file_handler(mode):
"""Decorate a function to accept path-like or file handles."""
def _handler(func):
@wraps(func)
def _wrapped(self, fd, *args, **kwargs):
if isinstance(fd, (str, PathLike)):
with open(fd, mode) as fd:
return _wrapped(self, fd, *args, **kwargs)
elif isinstance(fd, io.TextIOBase):
return func(self, fd, *args, **kwargs)
raise TypeError(
f"Expected a path-like or file handle, got {type(fd)}")
return _wrapped
return _handler
diff --git a/tests/test_dumper.py b/tests/test_dumper.py
index 8a35857..18eeeb6 100644
--- a/tests/test_dumper.py
+++ b/tests/test_dumper.py
@@ -1,193 +1,201 @@
# -*- coding: utf-8 -*-
#
# Copyright (©) 2016-2022 EPFL (École Polytechnique Fédérale de Lausanne),
# Laboratory (LSMS - Laboratoire de Simulation en Mécanique des Solides)
# Copyright (©) 2020-2022 Lucas Frérot
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published
# by the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.
from __future__ import division
import os
import pytest
import numpy as np
import tamaas as tm
from pathlib import Path
from numpy.testing import assert_allclose
from conftest import mtidfn, type_list
from tamaas.dumpers import NumpyDumper, JSONDumper, FieldDumper
from tamaas.dumpers._helper import directory_dump, step_dump
@pytest.fixture(scope="module",
params=type_list,
ids=mtidfn)
def model_fixture(request):
dim = tm.type_traits[request.param].dimension
model = tm.ModelFactory.createModel(request.param, dim * [1.], dim * [4])
model.displacement[:] = 3
model.traction[:] = 1
model['additional'] = 2 * np.ones(model.shape
+ [tm.type_traits[model.type].components])
return model
class Dumper(tm.ModelDumper):
"""Simple numpy dumper"""
def __init__(self):
tm.ModelDumper.__init__(self)
def dump(self, model):
np.savetxt('tractions.txt', np.ravel(model.traction))
np.savetxt('displacement.txt', np.ravel(model.displacement))
dumper_types = [NumpyDumper, JSONDumper]
try:
from tamaas.dumpers import H5Dumper
dumper_types.append(H5Dumper)
except ImportError:
pass
try:
from tamaas.dumpers import UVWDumper
dumper_types.append(UVWDumper)
except ImportError:
pass
try:
from tamaas.dumpers import NetCDFDumper
dumper_types.append(NetCDFDumper)
except ImportError:
pass
@pytest.mark.parametrize('dumper_class', dumper_types)
def test_dumper(model_fixture, tmp_path, dumper_class):
os.chdir(tmp_path)
filename = 'test_{}'.format(dumper_class.__name__)
if not issubclass(dumper_class, FieldDumper):
dumper = dumper_class(filename)
else:
dumper = dumper_class(filename, all_fields=True)
filename = dumper.file_path
dumper << model_fixture
# UVW does not read VTK files
try:
if dumper_class is UVWDumper:
with pytest.raises(NotImplementedError):
dumper.read(filename)
assert Path(filename).is_file()
return
except NameError:
pass
rmodel = dumper.read(filename)
def compare(model, reference):
assert model is not reference
assert model.type == reference.type
assert model.shape == reference.shape
for field in reference:
# assert field in model
assert_allclose(model[field], reference[field], rtol=1e-15)
compare(rmodel, model_fixture)
# Skipping sequence read for JSON
if dumper_class.__name__ == "JSONDumper":
return
dumper << model_fixture
count = 0
for rmodel in dumper.read_sequence(filename.replace("0000", "*")):
count += 1
compare(rmodel, model_fixture)
assert count == 2
def test_custom_dumper(tmp_path):
os.chdir(tmp_path)
model = tm.ModelFactory.createModel(tm.model_type.volume_2d, [1., 1., 1.],
[16, 4, 8])
dumper = Dumper()
dumper << model
tractions = np.loadtxt('tractions.txt')
displacements = np.loadtxt('displacement.txt')
assert_allclose(tractions.reshape(model.traction.shape), model.traction,
1e-15)
assert_allclose(displacements.reshape(model.displacement.shape),
model.displacement, 1e-15)
@pytest.fixture(scope="module")
def simple_model():
    return tm.ModelFactory.createModel(tm.model_type.basic_2d, [1, 1], [1, 1])
def test_directory_dump(simple_model, tmp_path):
os.chdir(tmp_path)
@directory_dump("dummy")
class Dummy(tm.ModelDumper):
def __init__(self):
super(Dummy, self).__init__()
def dump(self, model):
pass
@property
def file_path(self):
return 'dummy'
dumper = Dummy()
dumper << simple_model
+ directory = Path('dummy')
- assert Path('dummy').is_dir()
+ assert dumper.mkdir
+ assert directory.is_dir()
+
+ # Testing turning the behavior off
+ directory.rmdir()
+ dumper.mkdir = False
+ dumper << simple_model
+ assert not directory.is_dir()
def test_step_dump(simple_model, tmp_path):
os.chdir(tmp_path)
@step_dump
class Dummy(FieldDumper):
extension = 'dummy'
def _dump_to_file(self, file_descriptor, model):
with open(file_descriptor, 'w'):
pass
files = []
dumper = Dummy('dummy')
def dump():
files.append(dumper.file_path)
dumper << simple_model
dump()
dump()
for file in files:
assert Path(file).is_file()