Constraint Programming(ish) For Engineering Design

In summary, the conversation discusses the use of constraint programming in engineering design to find values that satisfy given constraints. Mathematical techniques such as linear programming, nonlinear programming, dynamic programming, and integer programming are mentioned as potential approaches, and resources such as Wikipedia and Google's OR-Tools are suggested for further learning. The conversation also delves into the specifics of the problem at hand, considering the types of constraints and variables involved, and suggests characterizing the type of problem before exploring potential solutions. Lastly, it mentions techniques such as pattern-search algorithms for dealing with constraints and objective functions that are not differentiable.
  • #1
person123
TL;DR Summary
I'm working on a tool for (civil) engineering design, and I was wondering if work in constraint programming could be useful.
I've been working on a tool in a browser for engineering (particularly civil engineering) design. By design, I just mean finding values (maybe the cross-section or material of a beam) which satisfy constraints. You would define constraints, possibly have something to minimize or maximize (like cost), and you could solve for possible values. The constraints could be:
  • Regular systems of equations
  • Inequalities
  • Equations with if statements (these are very common in empirical equations). They wouldn't be procedural: you could write "y=3 if x<0, y=2 otherwise" and "y=3" as two constraints and it would return "x<0"
  • Tables; again, you could provide constraints on the values in the table

Just today I learned a bit about constraint programming, and it seems to have a lot of similarities. I'm curious whether the work people have done in constraint programming could be applied to these sorts of problems, particularly making it efficient (maybe clever algorithms instead of checking every possible option).

Thanks!
 
  • #2
Yes, a great deal of work has been done in this field. To start scratching the surface you could try the Wikipedia page on constrained optimisation.

For a look at a well developed open source tool see Google's OR-Tools: https://developers.google.com/optimization
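
To get a feel for what a model looks like there, here is a minimal sketch using OR-Tools' CP-SAT solver: one integer decision variable, a couple of constraints, and an objective, loosely in the spirit of picking a beam size that meets a strength requirement at minimum cost. The variable name and the numbers are made up purely for illustration.

Python:
# A minimal sketch with Google's OR-Tools CP-SAT solver (pip install ortools).
# The variable, constraints and numbers are invented for illustration only.
from ortools.sat.python import cp_model

model = cp_model.CpModel()

# Hypothetical decision variable: a beam depth in whole millimetres.
depth = model.NewIntVar(100, 1000, 'depth_mm')

model.Add(3 * depth >= 900)   # made-up strength requirement
model.Add(depth <= 600)       # made-up headroom limit
model.Minimize(depth)         # stand-in for minimising cost

solver = cp_model.CpSolver()
status = solver.Solve(model)
if status in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    print('depth =', solver.Value(depth))   # prints 300 for these numbers

Note that CP-SAT works with integer variables; for continuous design variables OR-Tools also provides linear and mixed-integer solver wrappers.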

A search for "constrained optimisation engineering" led me to this page which seems to merit further investigation: http://apmonitor.com/me575/index.php/Main/HomePage

You can turn up more useful stuff by searching: note that I included "engineering" in the search because constrained optimisation is also used a great deal in economics, and whilst many of the techniques are identical, it is obviously easier to learn them through examples and exercises in a familiar domain. However this is a big topic and I suggest you focus down and follow a particular book or lecture notes to get an overview.
 
  • #3
The subjects of Linear Programming, Nonlinear Programming, Dynamic Programming, and Integer Programming are all concerned with optimizing various types of problems subject to constraints. This is a central concern in the field of Operations Research.
 
  • #4
Thank you!

I looked at a bit of the textbook, and I'll look at it more when I focus more on optimization.

Also, thank you for linking to Google's OR-Tools (it might be a bit overkill for what I'm doing, and I'm not sure how easy it would be to import for a website). I would mainly be interested in relatively simple algorithms for dealing with things like if statements. My approach right now would be to just go down all branches and then compare them, but I imagine there might be more efficient approaches.
 
  • #5
It's not clear what type of problem you're trying to optimize, but studying linear programming and the simplex method might be helpful. In linear programming, the optimum always lies on the boundary of the feasible region defined by the constraints.
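
As a tiny made-up illustration of that boundary property, here is a linear program solved with SciPy's linprog; the optimum it reports sits at a vertex of the feasible polygon where a constraint is active. The numbers are arbitrary.

Python:
# A tiny made-up LP solved with SciPy: maximise 3x + 2y subject to
#   x + y <= 4,  x + 3y <= 6,  x >= 0,  y >= 0.
# linprog minimises, so the objective is negated.
from scipy.optimize import linprog

c = [-3.0, -2.0]
A_ub = [[1.0, 1.0],
        [1.0, 3.0]]
b_ub = [4.0, 6.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x)   # [4. 0.] -- a vertex of the feasible region, where x + y <= 4 is active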
 
  • #6
person123 said:
I would mainly be interested in relatively simple algorithms for dealing with things like if statements. My approach right now would be to just go down all branches and then compare them, but I imagine there might be more efficient approaches.
This sounds very vague. You might be dealing with an integer programming problem ("if -- then -- else" converts to a variable equaling 0, 1, 2). There are some "branch and bound" algorithms that might help, but integer programming problems are a lot harder to deal with than many other types.
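
To make the "converts to an integer variable" idea concrete, here is one common encoding (a big-M formulation) of the "y=3 if x<0, y=2 otherwise" plus "y=3" example from post #1, solved with SciPy's mixed-integer interface. The big-M value, the small tolerance standing in for the strict "<", the variable bounds, and the choice to maximize x are all illustrative assumptions, not anything prescribed in this thread.

Python:
# One common big-M encoding of "y=3 if x<0, y=2 otherwise" plus "y=3",
# solved with SciPy's mixed-integer interface (SciPy >= 1.9).
# M, eps, the bounds and the "maximise x" objective are illustrative assumptions.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

M, eps = 10.0, 1e-6             # big-M and a small stand-in for the strict "<"
c = np.array([-1.0, 0.0, 0.0])  # variables [x, y, z]; minimise -x, i.e. maximise x

A = np.array([
    [0.0, 1.0, -1.0],   # y - z = 2         (so y=3 exactly when z=1)
    [1.0, 0.0,  M  ],   # x + M*z >= 0      (z=0 forces x >= 0)
    [1.0, 0.0,  M  ],   # x + M*z <= M-eps  (z=1 forces x <= -eps, i.e. x < 0)
    [0.0, 1.0,  0.0],   # y = 3             (the extra constraint)
])
lb = np.array([2.0, 0.0, -np.inf, 3.0])
ub = np.array([2.0, np.inf, M - eps, 3.0])

res = milp(c,
           constraints=LinearConstraint(A, lb, ub),
           integrality=np.array([0, 0, 1]),          # only z is 0/1
           bounds=Bounds([-M, 2.0, 0.0], [M, 3.0, 1.0]))
print(res.x)   # roughly [-1e-06, 3, 1]: the solver is pushed onto the x < 0 branch

Note that a mixed-integer solver returns a single optimal point (here x just below zero), not the region x<0 itself.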
 
  • #7
This may seem obvious, but it's important to remember that the set of optimal solutions is always a subset of the feasible region. Sometimes it's more efficient to first eliminate some infeasible value sets en route to finding the optimum instead of first formulating a system of inequalities as a standard LP problem. In some cases, e.g. when there is a confluence of linear and nonlinear constraints, as in some instances of multivariate regression analysis, your code has to be problem-specific, as distinguished from generally methodological.
 
  • #8
FactChecker said:
This sounds very vague. You might be dealing with an integer programming problem ("if -- then -- else" converts to a variable equaling 0, 1, 2). There are some "branch and bound" algorithms that might help, but integer programming problems are a lot harder to deal with than many other types.

I realize it was very vague: here's a more complete explanation. Let's say you had these three constraints:

a=1 if b<3
a=2 if b>=3
a=2

If solved for b, it would return b>=3, as that satisfies the constraints (if b were less than 3, a would have to equal 1 and you'd get a contradiction).

How I would do it now is to combine these constraints with AND statements. Each if statement converts to an AND/OR combination: "a=1 if b<3" becomes "(a=1 AND b<3) OR (b>=3)". This gives:

((a=1 AND b<3) OR (b>=3)) AND
((a=2 AND b>=3) OR (b<3)) AND
(a=2)

When distributing it out:

(a=1 AND b<3 AND a=2 AND b>=3 AND a=2) OR
(a=1 AND b<3 AND b<3 AND a=2) OR
(b>=3 AND a=2 AND b>=3 AND a=2) OR
(b>=3 AND b<3 AND a=2)

Removing contradicting options leaves just the third option:
(b>=3 AND a=2 AND b>=3 AND a=2)

Which reduces to:
b>=3 AND a=2

Giving the expected result! However, for lots of if statements (especially nested ones), I think this could become very inefficient. I imagine there might be a way of doing it without going through every possibility.
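
Here is a rough sketch of that brute-force expansion in code, just to pin down the approach described above; the (variable, operator, value) tuples and the simplified contradiction check are placeholders for whatever representation the real tool would use.

Python:
# Sketch of the AND/OR expansion above. Each atom is a (variable, op, value)
# tuple; each constraint is a list of alternative conjunctions (the ORs).
# The representation and the contradiction check are simplified placeholders.
from itertools import product

constraints = [
    [[('a', '==', 1), ('b', '<', 3)], [('b', '>=', 3)]],   # a=1 if b<3
    [[('a', '==', 2), ('b', '>=', 3)], [('b', '<', 3)]],   # a=2 if b>=3
    [[('a', '==', 2)]],                                     # a=2
]

def consistent(atoms):
    """Reject a branch whose atoms contradict each other."""
    eqs, lowers, uppers = {}, {}, {}
    for var, op, val in atoms:
        if op == '==':
            if var in eqs and eqs[var] != val:
                return False                      # e.g. a=1 and a=2
            eqs[var] = val
        elif op == '>=':
            lowers[var] = max(lowers.get(var, val), val)
        elif op == '<':
            uppers[var] = min(uppers.get(var, val), val)
    for var in set(lowers) & set(uppers):
        if lowers[var] >= uppers[var]:
            return False                          # e.g. b>=3 and b<3
    for var, val in eqs.items():
        if var in lowers and val < lowers[var]:
            return False
        if var in uppers and val >= uppers[var]:
            return False
    return True

# Distribute the ORs: one branch per choice of disjunct from each constraint.
branches = [sum(choice, []) for choice in product(*constraints)]
feasible = [br for br in branches if consistent(br)]
print(feasible)   # only the branch equivalent to (b>=3 AND a=2) survives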
 
  • #9
You should determine what kind of problem you are dealing with before you worry about ways to solve it.
1) Are all the constraints and the objective function linear?
2) Are the solution variables allowed to be real numbers or do they have to be integers?
Please characterize the problem as well as you can. You should read enough to be able to categorize your problem.
 
  • #10
The constraints and the objective function could be nonlinear, and the solution variables do not have to be integers. The constraints could be equalities or inequalities. I am assuming there's only one local optimum (and if there are multiple, I would be fine with just finding one).
 
  • #11
It is a constrained nonlinear programming problem. Are all the constraints and the objective function differentiable? If so, there are techniques that can use the derivatives. Otherwise, there are some pattern-search techniques.
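
For a made-up illustration of the two situations: SciPy's minimize can use a gradient-based constrained method such as SLSQP when everything is smooth, and a derivative-free method such as Nelder-Mead otherwise (Nelder-Mead in SciPy does not accept general constraints, so the constraint is folded in as a crude penalty below). The objective and constraint are invented for the example.

Python:
# Made-up smooth problem: minimise (x-2)^2 + (y-1)^2 subject to x + y >= 4.
from scipy.optimize import minimize

obj = lambda v: (v[0] - 2)**2 + (v[1] - 1)**2
con = {'type': 'ineq', 'fun': lambda v: v[0] + v[1] - 4}   # feasible when >= 0

# Gradient-based, constraint-aware method for the differentiable case.
smooth = minimize(obj, x0=[0.0, 0.0], method='SLSQP', constraints=[con])
print(smooth.x)    # about [2.5, 1.5], sitting on the constraint boundary

# Derivative-free alternative, with the constraint folded in as a crude penalty.
penalised = lambda v: obj(v) + 1e3 * max(0.0, 4 - v[0] - v[1])**2
rough = minimize(penalised, x0=[0.0, 0.0], method='Nelder-Mead')
print(rough.x)     # close to the same point, found without derivatives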
 
  • #12
FactChecker said:
It is a constrained nonlinear programming problem. Are all the constraints and the objective function differentiable? If so, there are techniques that can use the derivatives. Otherwise, there are some pattern-search techniques.

Yes, the constraints and objective functions are differentiable. I guess an exception to this is when there's a discrete set of options.

For example, say you have three materials A, B, and C with a given strength and density. The strength must be greater than 3 and you want the material with the minimum density.

Material | Strength | Density
A        | 2        | 10
B        | 5        | 30
C        | 4        | 20

The result would be material C. I would use the same approach as the if statements, where I would go through each option and check if it's feasible. Then for the feasible ones, I would find the optimal option.
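
That filter-then-minimise pass is simple to code directly; here is a sketch with the table hard-coded just for illustration.

Python:
# The table above, hard-coded for illustration: material -> (strength, density).
materials = {'A': (2, 10), 'B': (5, 30), 'C': (4, 20)}

# Keep the feasible options (strength > 3), then take the lightest one.
feasible = {name: props for name, props in materials.items() if props[0] > 3}
best = min(feasible, key=lambda name: feasible[name][1])
print(best)   # 'C'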

I also should have clarified earlier that I'm using sympy as my computer algebra system, which has tools for constrained nonlinear problems. This is why I was more concerned with the problems involving discrete options than with solving nonlinear constraints.
 
  • #13
person123 said:
I also should have clarified earlier that I'm using sympy as my computer algebra system
Why are you doing this? When we write programs to solve the same problem many times we normally translate the constraints of the problem into the language of the program we are using, and we write unit tests to ensure that we have done this properly.

For your example this is one way we might do this.
Python:
def solve_for_b(constraints):
    # Constraint 1: a=1 if b<3, so a != 1 forces b >= 3.
    # Constraint 2: a=2 if b>=3, so a != 2 forces b < 3.
    if constraints['a'] == 1:
        return { 'lt': 3 }
    if constraints['a'] == 2:
        return { 'ge': 3 }
    # Any other value of a contradicts both constraints, so no b is feasible.
    return { 'none': True }

# Tests.
constraints = {'a': 1}
assert solve_for_b(constraints) == { 'lt': 3 }, \
    'when a is 1 b should be < 3'

constraints = {'a': 2}
assert solve_for_b(constraints) == { 'ge': 3 }, \
    'when a is 2 b should be >= 3'

constraints = {'a': 3}
assert solve_for_b(constraints) == { 'none': True }, \
    'when a is 3 there is no feasible b'
 
  • #14
pbuk said:
Why are you doing this? When we write programs to solve the same problem many times we normally translate the constraints of the problem into the language of the program we are using, and we write unit tests to ensure that we have done this properly.

I'm planning on allowing the user to type in the constraints in a WYSIWYG equation editor, so I can't just code the particular constraints directly in. I also don't want a set of constraints to only be able to solve for one thing, so in the example I would also want to be able to solve for a given a constraint on b.

I plan on writing my own code for converting the if statements to OR statements and distributing them out. Then I would run each option through sympy to see if it gives a solution. In this case evaluating the options is very simple, but in general there may be a set of nonlinear constraints, and I don't want to write my own computer algebra system capable of solving nonlinear equations (I would have no idea how).

In general, I would write code which would work with searching a discrete set of options which the user inputs, and let sympy do the symbolic evaluation.
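
As a rough sketch of that hand-off for one surviving branch (the extra nonlinear inequality is invented only to show why a CAS is useful here, and the split into equalities and inequalities is my own simplification):

Python:
# Rough sketch of handing one surviving branch to sympy. The nonlinear
# inequality a*b**2 <= 50 is invented just for illustration; splitting the
# branch into equalities and inequalities is a simplification.
import sympy as sp

a, b = sp.symbols('a b', real=True)

equalities = [sp.Eq(a, 2)]                 # from the branch: a = 2
inequalities = [b >= 3, a * b**2 <= 50]    # from the branch, plus a made-up one

sol = sp.solve(equalities, dict=True)[0]   # {a: 2}
reduced = sp.reduce_inequalities([ineq.subs(sol) for ineq in inequalities], [b])
print(reduced)   # roughly (3 <= b) & (b <= 5)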
 
  • #15
IMHO, writing a general tool that can do a large variety of constrained optimizations is a very ambitious goal. Many experts in the field have not even tried. For a person like you, with no real expertise in optimization, to attempt such a thing is optimistic.
 
  • #17
FactChecker said:
IMHO, writing a general tool that can do a large variety of constrained optimizations is a very ambitious goal. Many experts in the field have not even tried. For a person like you, with no real expertise in optimization, to attempt such a thing is optimistic.
Is the issue doing it at all, or making it efficient? A quick search shows how to do nonlinear optimization in sympy. From my naive view, I don't see why I couldn't use this code to take a set of constraints and an optional objective function and perform the optimization, even if it is inefficient for the particular problem.

anorlunda said:
Real world problems are far beyond what a person can type in by hand. The article below describes power grid optimizations with as many as 250K unknowns, and 150K constraints. They are solved using linear programming libraries.
https://www.physicsforums.com/insights/renewable-energy-meets-power-grid-operations/

I'm just planning on it being for relatively simple problems which could be done by hand with trial and error. They would be the kind of problems I see in civil undergrad: finding the ideal beam cross-section for a particular load, designing a breakwater which satisfies a bunch of empirical equations, designing a mat foundation etc. There might be 10-20 or so constraints maximum. However, these sorts of problems (as far as I can tell) are done very often in the real world.
 
  • #18
pbuk said:
Why are you doing this? When we write programs to solve the same problem many times we normally translate the constraints of the problem into the language of the program we are using, and we write unit tests to ensure that we have done this properly.
For if statements, tables, and other discrete problems, I'm planning on writing code (I would use the approach I showed where I convert them to AND and OR statements). For solving systems of nonlinear equations, however, I plan to just use sympy.
 
  • #19
FactChecker said:
IMHO, writing a general tool that can do a large variety of constrained optimizations is a very ambitious goal. Many experts in the field have not even tried. For a person like you, with no real expertise in optimization, to attempt such a thing is optimistic.
I started working on it in MATLAB, which has a function (fmincon) for nonlinear constrained optimization. I had it optimize a truss geometry for different loading conditions (minimizing the total truss length), but the same code could also be used, for example, to optimize the bar cross-sections (the minimum area that can support the loads) simply by plugging in a different set of constraints as text. Since optimizing truss geometry is probably the most ambitious optimization I would imagine using it for, I think it could be used for a variety of other, more basic engineering design applications. I still haven't implemented if statements.

[Attached images: optimized_truss.png, optimized_truss_asy.png]
 
  • #20
If you are thinking about making an automated tool for your own use, then I would encourage you. As you encounter new optimization problems you can automate what you want and do the rest by hand. Nobody can complain about that and you may find it to be very satisfying. But if you are planning to make something for general use by others, then you may find that there are a huge variety of problems that people will come up with, each one requiring special code and algorithms. You may get tired of that quickly.
There are a number of types of optimization problems:
linear, nonlinear, mixed
integer, continuous, mixed
differentiable and non-differentiable objective and constraint functions
constrained, unconstrained
quadratic
You have a certain subset in mind and will be able to satisfy your own needs, but you may be amazed at what other users would expect from your program.
 
  • #21
FactChecker said:
If you are thinking about making an automated tool for your own use, then I would encourage you. As you encounter new optimization problems you can automate what you want and do the rest by hand. Nobody can complain about that and you may find it to be very satisfying. But if you are planning to make something for general use by others, then you may find that there are a huge variety of problems that people will come up with, each one requiring special code and algorithms. You may get tired of that quickly.
There are a number of types of optimization problems:
linear, nonlinear, mixed
integer, continuous, mixed
differentiable and non-differentiable objective and constraint functions
constrained, unconstrained
quadratic
You have a certain subset in mind and will be able to satisfy your own needs, but you may be amazed at what other users would expect from your program.
I'm thinking of limiting it to structural analysis that is basic enough that simulations and advanced software are unnecessary. I would like to use it for my own purposes where I'm working, but I would hope it could be used by other people doing similar work.
 

FAQ: Constraint Programming(ish) For Engineering Design

What is constraint programming?

Constraint programming is a type of problem-solving approach that involves representing a problem in terms of constraints and finding a solution that satisfies those constraints. It is commonly used in engineering design to optimize designs and find the best solution for a given set of constraints.

How does constraint programming differ from other problem-solving methods?

Unlike other problem-solving methods that focus on finding a single solution, constraint programming focuses on finding a set of solutions that satisfy a given set of constraints. This allows for more flexibility and can lead to better solutions.

What are the benefits of using constraint programming in engineering design?

Constraint programming can help engineers optimize designs and find the best solution for a given set of constraints. It can also help identify trade-offs between different design objectives and assist in decision making.

What are the limitations of constraint programming?

Constraint programming can be limited by the complexity of the problem and the quality of the constraints. It may also require significant computational resources to find optimal solutions for large and complex problems.

Are there any real-world applications of constraint programming in engineering design?

Yes, constraint programming is widely used in engineering design for various applications such as scheduling, resource allocation, and project planning. It has also been applied to design optimization problems in fields such as aerospace, automotive, and manufacturing industries.
