Multi-Objective Optimization
Multi-objective optimization (also known as multi-objective programming, vector optimization, multicriteria optimization, multiattribute optimization or Pareto optimization) is an area of multiple criteria decision making that is concerned with mathematical optimization problems involving more than one objective function to be optimized simultaneously. Multi-objective optimization has been applied in many fields of science, including engineering, economics and logistics, where optimal decisions need to be taken in the presence of trade-offs between two or more conflicting objectives. Minimizing cost while maximizing comfort when buying a car, and maximizing a vehicle's performance while minimizing its fuel consumption and pollutant emissions, are examples of multi-objective optimization problems involving two and three objectives, respectively. In practical problems, there can be more than three objectives. For a nontrivial multi-objective optimization problem, no single solution exists that simultaneously optimizes each objective. In that case, the objective functions are said to be conflicting, and there exists a (possibly infinite) number of Pareto optimal solutions. A solution is called nondominated, Pareto optimal, Pareto efficient or noninferior if none of the objective functions can be improved in value without degrading some of the other objective values. Without additional subjective preference information, all Pareto optimal solutions are considered equally good (as vectors cannot be ordered completely). Researchers study multi-objective optimization problems from different viewpoints and, thus, there exist different solution philosophies and goals when setting and solving them. The goal may be to find a representative set of Pareto optimal solutions, to quantify the trade-offs in satisfying the different objectives, or to find a single solution that satisfies the subjective preferences of a human decision maker (DM). …
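To make nondominance concrete, here is a minimal sketch that filters a set of candidate points down to its Pareto front, assuming every objective is to be minimized. The helper name pareto_front and the toy car data (cost, discomfort) are illustrative assumptions, not part of any standard library:

    import numpy as np

    def pareto_front(points):
        """Return the nondominated (Pareto optimal) rows of `points`,
        assuming every objective is to be minimized."""
        points = np.asarray(points, dtype=float)
        n = points.shape[0]
        keep = np.ones(n, dtype=bool)
        for i in range(n):
            # Point j dominates point i if j is <= i in every objective
            # and strictly < in at least one.
            dominates_i = (np.all(points <= points[i], axis=1)
                           & np.any(points < points[i], axis=1))
            if dominates_i.any():
                keep[i] = False
        return points[keep]

    # Toy example: cost vs. discomfort when buying a car (both minimized).
    cars = [(20000, 7), (25000, 4), (30000, 4), (22000, 6), (35000, 2)]
    print(pareto_front(cars))
    # (30000, 4) is dropped: (25000, 4) offers the same comfort at lower cost.

Every remaining point embodies a genuine trade-off: improving one objective necessarily worsens the other, which is why no single "best" point exists without preference information.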

Vicious Circle Principle
The vicious circle principle was endorsed by many predicativist mathematicians in the early 20th century as a way to prevent contradictions. The principle states that no object or property may be introduced by a definition that depends on that object or property itself. In addition to ruling out definitions that are explicitly circular (like ‘an object has property P iff it is not next to anything that has property P’), this principle rules out definitions that quantify over domains which include the entity being defined. Thus, it blocks Russell’s paradox, which defines a set S that contains all sets that don’t contain themselves. This definition is blocked because it defines a new set in terms of the totality of all sets, of which this new set would itself be a member. However, the principle also blocks one standard definition of the natural numbers. First, we define a property as being ‘hereditary’ if, whenever a number n has the property, so does n + 1. Then we say that x has the property of being a natural number if and only if it has every hereditary property that 0 has. This definition is blocked because it defines ‘natural number’ in terms of the totality of all hereditary properties, yet ‘natural number’ itself would be such a hereditary property, so the definition is circular in this sense. …
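In symbols, the blocked definition can be given a standard second-order rendering (the labels Her and N are chosen here for illustration):

    \mathrm{Her}(P) \;\equiv\; \forall n\,\bigl(P(n) \rightarrow P(n+1)\bigr)
    \mathrm{N}(x)   \;\equiv\; \forall P\,\bigl[\bigl(P(0) \land \mathrm{Her}(P)\bigr) \rightarrow P(x)\bigr]

The quantifier over P ranges over all properties, including N itself (since N(0) holds and N is hereditary), and this quantification over a totality that contains the definiendum is exactly what the vicious circle principle forbids.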

SAFE
This paper presents a practical approach for detecting non-stationarity in time series prediction. The method, called SAFE, works by monitoring the evolution of the spectral content of a time series through a distance function. It is designed to work in combination with state-of-the-art machine learning methods in real time, informing the online predictors to perform the necessary adaptation when non-stationarity appears. We also propose an algorithm that proportionally includes some past data in the adaptation process to overcome the catastrophic forgetting problem. To validate our hypothesis and test the effectiveness of our approach, we present comprehensive experiments on the different elements of the approach, involving artificial and real-world datasets. The experiments show that the proposed method is able to significantly save computational resources in terms of processor or GPU cycles while maintaining high prediction performance. …
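A minimal sketch of the general idea, assuming a sliding-window spectrum comparison; the window size, band pooling, Euclidean distance, and threshold below are illustrative choices, not the paper's exact design:

    import numpy as np

    def spectral_signature(window, n_bands=16):
        """Normalized magnitude spectrum of one window, pooled into bands."""
        spec = np.abs(np.fft.rfft(window))
        bands = np.array_split(spec, n_bands)
        sig = np.array([b.mean() for b in bands])
        return sig / (sig.sum() + 1e-12)

    def nonstationarity_flags(series, win=128, step=32, threshold=0.15):
        """Flag a change whenever the spectral distance between consecutive
        windows exceeds `threshold` (Euclidean distance; illustrative)."""
        flags, prev = [], None
        for start in range(0, len(series) - win + 1, step):
            sig = spectral_signature(series[start:start + win])
            if prev is not None and np.linalg.norm(sig - prev) > threshold:
                flags.append(start)  # signal the online predictor to adapt
            prev = sig
        return flags

    # A stationary sine followed by a frequency shift -> flagged changes
    # near the boundary at index 1024.
    t = np.arange(2048)
    series = np.concatenate([np.sin(0.05 * t[:1024]), np.sin(0.25 * t[1024:])])
    print(nonstationarity_flags(series))

The appeal of such a scheme is that the expensive model adaptation runs only when the detector fires, rather than continuously, which is the source of the computational savings the abstract describes.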
