Topology optimization (TO) is a mathematical method that optimizes material layout within a given design space, for a given set of loads, boundary conditions, and constraints, with the goal of maximizing the performance of the system. TO differs from shape optimization and sizing optimization in that the design can attain any shape within the design space, rather than being restricted to predefined configurations.

The conventional TO formulation uses the finite element method (FEM) to evaluate the design performance. The design is optimized using either gradient-based mathematical programming techniques, such as the optimality criteria algorithm and the method of moving asymptotes, or non-gradient-based algorithms such as genetic algorithms.
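To make the gradient-based route concrete, below is a minimal sketch of an optimality-criteria density update, assuming a compliance objective with non-positive sensitivities and uniform element volumes; the function name and parameters are illustrative, not from any particular library.

```python
import numpy as np

def oc_update(rho, dc, vol_frac, move=0.2, eta=0.5):
    """Optimality-criteria density update (sketch).

    rho      -- current element densities in [0, 1]
    dc       -- compliance sensitivities (non-positive for compliance)
    vol_frac -- target volume fraction for the material constraint
    Bisect the Lagrange multiplier of the volume constraint until the
    updated design uses exactly the allowed amount of material.
    """
    lo, hi = 1e-9, 1e9
    while (hi - lo) / (hi + lo) > 1e-6:
        lam = 0.5 * (lo + hi)
        # Multiplicative update, limited by move bounds and the [0, 1] box.
        rho_new = rho * (-dc / lam) ** eta
        rho_new = np.clip(rho_new, np.maximum(rho - move, 0.0),
                          np.minimum(rho + move, 1.0))
        if rho_new.mean() > vol_frac:
            lo = lam      # too much material: raise the multiplier
        else:
            hi = lam
    return rho_new
```

The bisection exploits the fact that the amount of material in the updated design decreases monotonically as the multiplier grows, so the volume constraint can be met to high precision in a few dozen iterations.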

Topology optimization has a wide range of applications in aerospace, mechanical, biochemical, and civil engineering. Currently, engineers mostly use TO at the concept level of a design process. Because of the free forms that naturally occur, the result is often difficult to manufacture, so the design emerging from TO is often fine-tuned for manufacturability. Adding constraints to the formulation in order to increase manufacturability is an active field of research. In some cases results from TO can be directly manufactured using additive manufacturing; TO is thus a key part of design for additive manufacturing.

This mathematically derived method was clearly defined, explained, and made usable for mechanics in the 2000s, notably through the founding work of Ole Sigmund.

Increasingly sophisticated topology optimization software enables engineers to minimize the material used in an object while maintaining or improving its strength or flexibility (where required) and taking into account the constraints it will be subjected to. This work was formerly based on intuition, trial and error, and the ingenuity of designers and manufacturing engineers.
A very simple example is the optimized reduction in the number of spokes of a bicycle wheel. Until recently, only simple forms could be handled, because these programs are computationally demanding and were quickly overwhelmed by the complexity of the requested work.
In October 2017, researchers from a Danish university presented in the journal Nature a method for applying this work to large objects by improving the achievable resolution. (A 2D image is composed of pixels, while a 3D model is composed of voxels.) Until recently, the resolution of optimized 3D models was limited to about 5 million voxels, but the new program optimizes objects at up to 1 billion voxels. This made it possible, for example, to model and redesign the wing of a Boeing 777 so that it is 5% lighter while being reinforced from the inside by curved longitudinal and diagonal ribs rather than a grid, with an expected saving of 200 tonnes of kerosene per year. The calculation required days of computation on a supercomputer, and the resulting design (which evokes the interior of some bones or parts of insect exoskeletons) cannot currently be manufactured, but progress in 3D printing could soon put it within reach.

Problem statement
A topology optimization problem can be written in the general form of an optimization problem as:

  minimize over ρ:  F = F(u(ρ), ρ) = ∫_Ω f(u(ρ), ρ) dV
  subject to:       G_0(ρ) = ∫_Ω ρ dV − V_0 ≤ 0
                    G_j(u(ρ), ρ) ≤ 0,  j = 1, …, m

The problem statement includes the following:

An objective function F(u(ρ), ρ). This function represents the quantity that is being minimized for best performance. The most common objective function is compliance, where minimizing compliance leads to maximizing the stiffness of a structure.

The material distribution as a problem variable. This is described by the density of the material at each location x, ρ(x). Material is either present, indicated by 1, or absent, indicated by 0.

The design space Ω. This indicates the allowable volume within which the design can exist. Assembly and packaging requirements and human and tool accessibility are some of the factors that need to be considered in identifying this space. With the definition of the design space, regions or components in the model that cannot be modified during the course of the optimization are treated as non-design regions.

m constraints G_j(u(ρ), ρ) ≤ 0, characteristics that the solution must satisfy. Examples are the maximum amount of material to be distributed (a volume constraint) or maximum stress values.

Evaluating u(ρ) often includes solving a differential equation. This is most commonly done using the finite element method, since these equations do not have a known analytical solution.
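As a toy illustration of this evaluation step, the sketch below solves the FEM system K u = f for a 1D bar whose element stiffness is scaled by the local density; this is a deliberately minimal stand-in for the 2D/3D solvers used in practice, and all names are illustrative.

```python
import numpy as np

def bar_displacements(rho, E=1.0, A=1.0, L=1.0, load=1.0):
    """Solve K u = f for a 1D bar of len(rho) elements, fixed at node 0,
    with a point load at the free end.  Each element's stiffness scales
    with its density, as in density-based topology optimization."""
    n = len(rho)
    le = L / n                          # element length
    K = np.zeros((n + 1, n + 1))
    for e, r in enumerate(rho):
        k = r * E * A / le              # density-scaled element stiffness
        K[e:e + 2, e:e + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
    f = np.zeros(n + 1)
    f[-1] = load
    # Apply the fixed support at node 0 by reducing the system.
    u = np.zeros(n + 1)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])
    return u
```

With all densities equal to one, the tip displacement recovers the textbook value load·L/(E·A), which is a convenient sanity check for the assembly.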

Implementation methodologies
There are various implementation methodologies that have been used to solve TO problems. In mechanics, solving a topology optimization problem starts by modeling the part, or the set of parts, to be optimized using the finite element method. A classical approach then considers, at every point of the optimization volume, a material density varying between 0 and 1. Other methods consider the local orientation of the material (for non-isotropic materials) or other characteristics. In these methods, optimizing generally involves minimizing the strain energy of the structure, which roughly amounts to finding the most rigid structure possible. One can either fix the amount of material used, in order to highlight optimal forms and guide a design carried out otherwise, or seek directly to define a form that minimizes the material needed while respecting a stress limit that must not be exceeded. In practice, filtering and thresholding are also applied, in particular to impose geometrical constraints related to the manufacturing process (symmetries, whether hollow volumes are allowed, joint planes, and so on).

The main steps and difficulties to overcome are generally the following:

Define the specifications of the part to design:
Actually available space: it is often much larger than the space occupied by any existing part, and can be further enlarged by reconsidering from scratch the function actually to be fulfilled and the constraints surrounding this part, or the set of parts, to be redesigned. The areas where material is imposed or prohibited (for functional or aesthetic reasons) must not be forgotten.
Mechanical connections with the environment: the possible connections with neighboring parts must be reconsidered from scratch, because there is often much more freedom in the fastening zones than assumed a priori. It is sometimes unclear which areas should be fixed or which zones are loaded by forces; the most pragmatic approach is then to imagine how the part could be tested on a test bench, with fixed links and actuators, for example.
Mechanical loads experienced: all the mechanical loads seen by the part must be taken into account, beyond the main function, i.e. loads related to manufacturing steps (including machining), loads related to handling the part (assembly/disassembly, transport), and accidental loads (shocks), for example.
Symmetries and manufacturing conditions (these being increasingly taken into account by software).

Run the topology optimization calculation: the fineness of the mesh must be adapted to the desired spatial precision and to the available computing resources; since the calculations can be long, it is advisable to keep the first runs on the scale of a few minutes and then refine them. It is also necessary to check how the different load cases are taken into account by the algorithm. Indeed, if one simply looks for the most rigid structure possible for a given mass, the strain energies of the different load cases are simply summed, and it may then be necessary to weight them against each other. On the other hand, if the objective is to obtain the lightest possible part that does not break, no weighting is needed.
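The summation of load-case energies described above can be sketched as a weighted sum of compliances; this is a minimal illustration with an assumed pre-assembled stiffness matrix, not a fragment of any specific solver.

```python
import numpy as np

def weighted_compliance(K, load_cases, weights):
    """Aggregate a stiffness objective over several load cases.

    For each load vector f, compliance is f . u with K u = f; the
    overall objective is the weighted sum, so the weights decide how
    strongly each load case influences the optimized shape.
    """
    total = 0.0
    for f, w in zip(load_cases, weights):
        u = np.linalg.solve(K, f)     # displacements for this load case
        total += w * float(f @ u)     # compliance contribution
    return total
```

Setting all weights to one reproduces the plain summation mentioned in the text; unequal weights let a rare but critical load case dominate the design.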
Analysis of the result: to show an easily understandable part (with well-defined void and solid regions), the result is usually filtered by the software for display (e.g. areas with a material density greater than 50% are shown as solid, the rest as empty). One must therefore keep in mind that the algorithm actually works with a more or less dense, porous material, and that displayed zones of material apparently disconnected from the rest are quite possible: they are connected to the rest by low-density material that is not displayed. The analysis therefore consists in defining a part made of clear voids and solids as close as possible to what the algorithm proposes.

There are parameters (sometimes hidden) for exploring these subtleties in detail: the material threshold (50% by default in general), penalization (a parameter that limits zones with densities around 50%, but can degrade the convergence of the algorithms), filtering/smoothing (a filter that eliminates details considered too small), and of course the fineness of the mesh (which reveals more or less fine details). It is often realized at this stage that the form obtained is absurd, usually because a major constraint was omitted or because the problem was poorly posed (for example, if there are not enough connections to the frame to hold the part in place, or because supports or loads were applied to an area where material is prohibited).
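The display thresholding described above can be sketched in a few lines; this is an illustrative helper (names and the "grey band" bounds are assumptions, not from any software) that also reports how much material sits at intermediate densities, since that is exactly the material the binarized picture can hide.

```python
import numpy as np

def threshold_design(rho, cutoff=0.5, grey_band=(0.2, 0.8)):
    """Binarize a grey density field for display.

    Returns a boolean solid/void mask (density >= cutoff counts as
    solid) together with the fraction of elements at intermediate
    'grey' densities - a large grey fraction warns that the displayed
    design may omit low-density material linking apparent islands."""
    solid = rho >= cutoff
    grey = np.mean((rho > grey_band[0]) & (rho < grey_band[1]))
    return solid, float(grey)
```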

Drawing and verification: once the interpretation of the results is consolidated, the part can be drawn as close as possible to the topology obtained (number of bars/plates, orientation, relative thicknesses), but possibly in a more pleasing form, because the so-called "organic" shapes obtained by topology optimization are not always suitable. This is why a skin is sometimes imposed on the outside of the part (the visible portion), limiting the topology optimization to the inside of the part to be lightened (the invisible portion). If possible, it is best to use lattices (i.e. a tight network of beams or walls, such as foams) in order to place intermediate-density material where the calculation calls for it.

Continuous and discrete topology optimization
One can distinguish between continuous and discrete topology optimization. In continuous topology optimization, the material distribution within the installation space is sought. In discrete topology optimization, discrete elements are sought to fill the construction space. For example, an optimal framework (truss) can be sought, which ultimately represents a topology of the overall object.

Continuous topology optimization
In practice, topology optimization is used in the design process to obtain proposals for initial designs of components. The designer must first determine the maximum available space and the boundary conditions (loads and restraints). These data are converted into an FE (finite element) model.

Basically, a distinction is made between material and geometric topology optimization. In geometric topology optimization, the geometry of the component is described by the shape of its outer boundary, i.e. its edges and surfaces. Recesses can also be introduced within the component boundary and varied in shape. Material topology optimization describes the geometry of a part by the material distribution in the design space: each finite element in the design space is assigned a density. For simple optimization algorithms, such as the optimality criteria (e.g. Fully Stressed Design), the density is set to either 0 or 100%, like a simple on/off switch. Fully Stressed Design retains the elements that are stressed near the maximum allowable stress, so that at the end of the optimization almost every element of the FE mesh is fully exploited in terms of strength. Mathematical programming is an optimization approach that uses the partial derivatives of the objective function to determine the change of the individual parameters for the next iteration. Accordingly, there must be a continuous density distribution for differentiability. In the so-called homogenization method, the change in density is described by a microscopic hollow body in each of the finite elements and then converted, via a non-linear macroscopic material law, into a change in the modulus of elasticity. From this, the stresses and deformations of the component can be calculated. The result of such a topology optimization is a rugged, porous design model which, because of its bone-like structure and its neglect of manufacturing restrictions, only serves as an aid in finding a shape. One way to improve the result is to convert the FE model back into a smoothed CAD surface model. If necessary, manufacturing restrictions can also be taken into account.

Discrete topology optimization
One of the first topology optimizations was carried out by Anthony George Maldon Michell. Even today, topology optimizations are carried out on trusses. The reason for this is the low computation time, although the resulting models are significantly further from reality than those of continuous topology optimization.

Solving TO problems in a discrete sense is done by discretizing the design domain into finite elements. The material densities inside these elements are then treated as the problem variables: a material density of one indicates the presence of material, while zero indicates its absence. Since the attainable topological complexity of the design depends on the number of elements, a large number is preferred. A large number of finite elements increases the attainable topological complexity, but comes at a cost. Firstly, solving the FEM system becomes more expensive. Secondly, algorithms that can handle a large number (several thousands of elements is not uncommon) of discrete variables with multiple constraints are unavailable; moreover, they are impractically sensitive to parameter variations. In the literature, problems with up to 30000 variables have been reported.

Solving the problem with continuous variables
The complexities stated earlier with solving TO problems using binary variables have caused the community to search for other options. One is to model the densities with continuous variables: the material densities can then also attain values between zero and one. Gradient-based algorithms that handle large numbers of continuous variables and multiple constraints are available, but the material properties have to be modelled in a continuous setting. This is done through interpolation. One of the most widely implemented interpolation methodologies is the SIMP method (Solid Isotropic Material with Penalisation). This interpolation is essentially a power law, E = E(ρ) = ρ^p E_0, which interpolates the Young's modulus of the material to the scalar selection field. The value of the penalisation parameter p is generally taken between 1 and 3, which has been shown to be consistent with the micro-structure of the materials. In the SIMP method a lower bound on the Young's modulus, E_min, is added to make sure the derivatives of the objective function are non-zero when the density becomes zero. The higher the penalisation factor, the more SIMP penalises the use of non-binary densities. Unfortunately, the penalisation parameter also introduces non-convexities.
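The SIMP interpolation and its derivative can be written in a few lines; this sketch uses the common "modified SIMP" form with the lower bound E_min folded into the interpolation, and the default values (E_0 = 1, E_min = 1e-9, p = 3) are illustrative conventions rather than requirements.

```python
import numpy as np

def simp_young(rho, E0=1.0, Emin=1e-9, p=3.0):
    """SIMP interpolation of Young's modulus: E(rho) = Emin + rho^p (E0 - Emin).

    The lower bound Emin keeps the stiffness matrix non-singular and
    the objective derivatives meaningful where the density is zero."""
    return Emin + rho**p * (E0 - Emin)

def simp_young_drho(rho, E0=1.0, Emin=1e-9, p=3.0):
    """Derivative dE/drho, as used by gradient-based optimizers."""
    return p * rho**(p - 1) * (E0 - Emin)
```

Note how the penalisation acts: at ρ = 0.5 with p = 3 the interpolated stiffness is only about 12.5% of E_0, so intermediate densities buy disproportionately little stiffness for their material cost, pushing the optimizer toward 0/1 designs.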

Shape derivatives
Topological derivatives
Level set
Phase field
Evolutionary Structural Optimization
Commercial Software
There are several commercial topology optimization software packages on the market. Most of them use topology optimization as a hint of what the optimal design should look like, and manual geometry reconstruction is required. A few solutions produce optimal designs ready for additive manufacturing.

Structural compliance
A stiff structure is one that has the least possible displacement under a given set of boundary conditions. A global measure of the displacements is the strain energy (also called compliance) of the structure under the prescribed boundary conditions. The lower the strain energy, the higher the stiffness of the structure. The problem statement therefore involves minimizing the strain energy objective functional.

On a broad level, one can see that the more material there is, the less the deflection, as there is more material to resist the loads. The optimization therefore requires an opposing constraint, the volume constraint. This is in reality a cost factor, as we would not want to spend a lot of money on material. To obtain the total material used, the selection field can be integrated over the volume.
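On a discretized design, that integral reduces to a sum of element densities times element volumes; the sketch below writes the volume constraint in the standard g(ρ) ≤ 0 form (the helper name and signature are illustrative).

```python
import numpy as np

def volume_constraint(rho, elem_vol, max_frac):
    """Volume constraint in g(rho) <= 0 form.

    rho      -- element densities (the discretized selection field)
    elem_vol -- volume of each element (scalar for a uniform mesh)
    max_frac -- allowed fraction of the full design-space volume
    Returns used material minus the allowed amount; non-positive
    values mean the constraint is satisfied."""
    used = np.sum(rho * elem_vol)                       # integral of rho dV
    allowed = max_frac * np.sum(np.ones_like(rho) * elem_vol)
    return float(used - allowed)
```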

Finally, the elasticity governing differential equations are plugged in to obtain the final problem statement:

  minimize over ρ:  ∫_Ω (1/2) σ : ε dΩ

subject to:

  ρ ∈ [0, 1]
  ∫_Ω ρ dΩ ≤ V
  div σ + F = 0   (equilibrium)
  σ = C : ε        (constitutive law)

However, a straightforward implementation of such a problem in a finite element framework is still infeasible, owing to issues such as:

Mesh dependency: the design obtained on one mesh is not the one that would be obtained on another mesh; the features of the design become more intricate as the mesh is refined.
Numerical instabilities: material appears in checkerboard-like patterns.

Some techniques, such as filtering based on image processing, are currently being used to alleviate some of these issues.
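A density filter of this kind replaces each element density by a weighted average of its neighbours within a fixed radius; the 1D sketch below uses linear "cone" weights and illustrative names, as a minimal stand-in for the 2D/3D filters used in practice.

```python
import numpy as np

def density_filter_1d(rho, radius):
    """Cone-weighted density filter (1D sketch).

    Each filtered density is a weighted average of the neighbours
    within `radius` elements, which suppresses checkerboard patterns
    and makes the design less dependent on the mesh resolution."""
    n = len(rho)
    out = np.empty(n)
    for i in range(n):
        j = np.arange(max(0, i - radius), min(n, i + radius + 1))
        w = radius + 1 - np.abs(j - i)    # linear 'cone' weights
        out[i] = np.sum(w * rho[j]) / np.sum(w)
    return out
```

A uniform field passes through unchanged, while an alternating 0/1 "checkerboard" is smoothed toward intermediate values, which is exactly the instability-suppressing behaviour the text describes.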

3F3D Form Follows Force 3D Printing
The current proliferation of 3D printing technology has allowed designers and engineers to take advantage of topology optimization techniques when designing new products.

Topology optimization combined with 3D printing enables significant lightweighting, improved structural performance, and a shortened design-to-manufacturing cycle.

Multiphysics problems

Source from Wikipedia