# AI Basics

#### Fuzzy Logic: Putting an "Almost" in Mathematics

One of the biggest problems with simulating thought on a computer is the rigidity of classical logic.  To a computer, a statement is either true or false; there is no 'in-between' or 'almost'.  Human brains, by contrast, work almost entirely in partial truths and generalizations.  They see 'warm', 'cool', 'balmy', 'freezing', 'chilly', and 'scalding' where a computer sees only 'hot' and 'cold'.  The world in which humans make decisions is just as fuzzy: fast, slow, near, far, large, small.  Nearly every parameter a choice is based on is general in nature and consequently imprecise.
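The usual way to express these partial truths is a membership function, which maps a crisp value to a degree of truth between 0 and 1. A minimal sketch in Python, where the 15&ndash;25 degree ramp boundaries are illustrative choices rather than standard values:

```python
def membership_warm(temp_c):
    """Degree (0.0 to 1.0) to which a temperature counts as 'warm'.

    Below 15 degC it is not warm at all; above 25 degC it is fully
    warm; in between, the degree rises linearly. The boundaries are
    hypothetical, chosen only to illustrate the idea.
    """
    if temp_c <= 15:
        return 0.0
    if temp_c >= 25:
        return 1.0
    return (temp_c - 15) / 10.0

print(membership_warm(10))  # 0.0 -- clearly not warm
print(membership_warm(20))  # 0.5 -- 'somewhat' warm
print(membership_warm(30))  # 1.0 -- fully warm
```

A temperature of 20 degrees is thus 'half warm': neither true nor false, but a value in between, which is exactly what binary logic cannot express.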

Fuzzy Logic gets around this by accepting imprecise input directly, which lets it draw conclusions from incomplete, indefinite, and noisy components.  Freed from binary logic, where a value must be wholly true or wholly false, it can better mimic human decision-making: it assigns partial truths and reasons with them, striving to imitate complex human reasoning and arrive at realistic conclusions about the imprecise, fuzzy world we live in.

In order to do this, a Fuzzy Logic implementation accepts standard, usually precise, input data.  It then applies at least one, but usually several, 'if-then' rules written with keyword descriptors, combined with standard algorithmic programming techniques.  The keyword descriptors make clear to the programmer exactly what happens in each step, while the code tells the computer what to do.  Once every applicable rule has been evaluated, the outcomes are averaged, and the result is transformed into a plain, "defuzzified" value, which is returned to whatever called the Fuzzy Logic routine in the first place.
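The whole pipeline described above can be sketched in a few lines. The example below is a hypothetical fan-speed controller: the crisp temperature is fuzzified into degrees of 'cold', 'warm', and 'hot', each fuzzy set drives one if-then rule, and the rule outputs are defuzzified by a weighted average. All the set boundaries and the 0/40/100 speed targets are invented for illustration:

```python
def fuzzify(temp_c):
    """Map a crisp temperature to membership degrees in three fuzzy sets.

    Ramp and triangle boundaries are illustrative assumptions.
    """
    cold = max(0.0, min(1.0, (18 - temp_c) / 8))        # full below 10 degC
    warm = max(0.0, min(1.0, 1 - abs(temp_c - 22) / 6))  # peaks at 22 degC
    hot = max(0.0, min(1.0, (temp_c - 26) / 8))          # full above 34 degC
    return {"cold": cold, "warm": warm, "hot": hot}

def fan_speed(temp_c):
    """Apply the if-then rules and defuzzify to a crisp speed (percent).

    Rules: if cold then speed 0; if warm then speed 40; if hot then 100.
    Defuzzification is a weighted average of each rule's output,
    weighted by how strongly the rule fired.
    """
    memberships = fuzzify(temp_c)
    rule_outputs = {"cold": 0.0, "warm": 40.0, "hot": 100.0}
    total_firing = sum(memberships.values())
    if total_firing == 0:
        return 0.0  # no rule fired at all
    return sum(memberships[s] * rule_outputs[s] for s in rule_outputs) / total_firing

print(fan_speed(22))  # 40.0 -- only the 'warm' rule fires
print(fan_speed(27))  # between 40 and 100 -- 'warm' and 'hot' blend
```

Because 27 degrees is partly 'warm' and partly 'hot', both rules fire, and the defuzzified speed lands between the two rule outputs rather than snapping to one or the other.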

This method lets the system adapt when it encounters conflicts, errors, and generalizations along the way: averaging the outputs of the 'if-then' rules smooths those anomalies out, rendering them insignificant.