The use of fuzzy logic in classification
2019, MATEC Web of Conferences
https://doi.org/10.1051/MATECCONF/201929204008
The use of fuzzy logic in classification
Amaury Caballero 1,∗
1 Department of Electrical & Computer Engineering
Florida International University. Miami, USA
Abstract
When determining the degree of coincidence of any multi-feature information, received in the form of a fuzzy vector, with a pre-established known pattern, two general steps should be followed. The first step is to eliminate the features that have little or no effect on the final results and to maintain only those that will influence the pattern recognition. This step could be defined as the classification process and is imperative for the simplification of the problem. One example of classification that could considerably reduce system costs is when using sensors distributed along an industrial process to manage information at a central location. Several methods could be used for classification, such as statistical methods, rough sets, fuzzy logic, or information theory. The second step is to find out the correlation between the received fuzzy vector and the vector defining the known pattern, using the previously selected features. For this part, the use of fuzzy logic is extremely convenient. The present work analyzes some of the methods used for classification and pattern recognition, based on concrete and practical examples.
1. Introduction
When analyzing an information system or a database, we frequently face problems such as attribute redundancy and missing or diffuse values, which are due in general to noise and to missing partial data. Several approaches have been developed for minimizing the number of attributes necessary to represent the desired category structure by eliminating redundancy. The lack of data, or of complete knowledge of the system, makes developing a model by conventional means a practically impossible task. This lack of data can be attributed to sensor failure, or simply to incomplete system information. Finally, diffuse values can be related to noise or imprecise measurements from the sensors. In many applications, the information is collected from different sensors and is corrupted by noise and outliers. Different methods have been presented [1,2,3,4,5,6]. The present work is devoted to the use of fuzzy logic only as an instrument for solving this problem.
For dealing with interval-valued information systems, one frequently used procedure is discretization. Yee Leung et al. [1] have presented a very useful method, based on rough sets, for obtaining rules that discriminate the minimum number of attributes and give a first approach to object classification. Some aspects of the method are reproduced here.
2. Attribute reduction
Procedure:
1. Table preparation: from the original table, a new one presenting the minimum and maximum values of each parameter for each object is generated.
2. Define the misclassification rates (α_ijk). The region of coincidence, or misclassification rate, of two attributes may vary from 0, when there is no coincidence at all, to 1, when the two attributes coincide completely.
In general, the probability that objects in class ui are misclassified into class uj according to attribute k can be represented by [1]:
α_ijk = 0, if [l_ik, u_ik] ∩ [l_jk, u_jk] = ∅;
α_ijk = min{min(u_ik − l_jk, u_jk − l_ik)/(u_ik − l_ik), 1}, if [l_ik, u_ik] ∩ [l_jk, u_jk] ≠ ∅,
where α is the permissible misclassification rate, which gives the permitted error in the classification.
Note that in general α_ijk ≠ α_jik.
3. Define α_ij as the error of class u_i being misclassified into class u_j in the system:
α_ij = min{α_ijk : k ≤ m}, where m is the maximum number of attributes.
4. Find out the maximum mutual classification error between classes (β_ijk):
β_ijk = max{α_ijk, α_jik},
where β_ijk = β_jik.
5. For each pair of classes, find out the minimum mutual classification error between classes u_i and u_j in the system:
β_ij = min β_ijk for 1 ≤ k ≤ m.
Let us define the permissible misclassification rate between classes as α. If β_ij ≤ α, there must exist an attribute A_k such that, by using A_k, the two classes u_i and u_j can be separated within the permissible misclassification rate α.
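As a sketch, the two-class case of the procedure above can be written as follows; the interval values, number of attributes, and permissible rate are hypothetical, chosen only to illustrate the computation:

```python
# Sketch of the attribute-reduction procedure (hypothetical data).
# Each interval (l_ik, u_ik) is the min/max of attribute k over class u_i.

def alpha(i_iv, j_iv):
    """Rate at which class i is misclassified into class j for one attribute."""
    (li, ui), (lj, uj) = i_iv, j_iv
    if ui < lj or uj < li:  # intervals do not intersect
        return 0.0
    return min(min(ui - lj, uj - li) / (ui - li), 1.0)

# classes[i][k] = (l_ik, u_ik); two classes, two attributes (made-up values)
classes = [
    [(0.27, 0.59), (0.12, 1.06)],
    [(0.54, 0.74), (0.94, 2.50)],
]

m = 2  # number of attributes
# beta_ijk = max(alpha_ijk, alpha_jik); beta_ij = min over attributes k
betas = [max(alpha(classes[0][k], classes[1][k]),
             alpha(classes[1][k], classes[0][k])) for k in range(m)]
beta_ij = min(betas)

permissible = 0.2  # permissible misclassification rate (alpha in the text)
separable = beta_ij <= permissible
print(betas, beta_ij, separable)
```

With these made-up intervals the second attribute gives the smaller mutual error, so it is the one that separates the two classes within the permitted rate.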
3. Attribute Reduction Using Information Theory
Several methods using information theory have been developed. The method presented here is developed in detail in [6]. The general procedure is as follows. It is logical to think that when two classes coincide for some parameter k, the information obtained from this parameter for discriminating between classes i and j is 0, and that it increases as the coincidence diminishes. This leads to representing this information, following the Shannon and Hartley definitions, on a logarithmic scale. Here the logarithm is used to provide the additivity characteristic for independent uncertainty.
Expressing it with logarithms base 10, it is given as
I_ijk = −log(β_ijk) [Hartley]
Similarly, the minimum information required for the classification between two classes i and j, for an attribute k and permissible misclassification rate α, is given by
I_α = −log α [Hartley].
If I_ijk ≥ I_α, the two classes can be separated using the attribute k. An algorithm has been created for solving this task.
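A minimal sketch of this check, with a hypothetical β value for the attribute:

```python
import math

# Converting mutual misclassification rates to information in Hartley,
# as in the section above. The beta value is hypothetical.
def info_hartley(beta):
    """I_ijk = -log10(beta); infinite when the classes never coincide."""
    return math.inf if beta == 0 else -math.log10(beta)

alpha_perm = 0.2                   # permissible misclassification rate
I_alpha = -math.log10(alpha_perm)  # minimum information needed (~0.70 Hartley)

beta_ijk = 0.05                    # hypothetical mutual error for attribute k
I_ijk = info_hartley(beta_ijk)     # ~1.30 Hartley
print(I_ijk >= I_alpha)            # attribute k separates the two classes
```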
4. Fuzzy logic classification
The major task of fuzzy-based pattern classification is the extraction of knowledge from numerical data to build a rule base, which will permit the classification of new data members. One way of calculating the similarity is given below [7,8]:
Let P*(X) be a family of fuzzy sets with A_i ≠ ∅ and A_i ≠ X. Defining two fuzzy sets from this family, A, B ∈ P*(X), the expression
(A, B) = (A·B) ∧ (A ⊕ B)
describes the degree of similarity of the two sets A and B. When the approaching degree approaches unity, the two sets have a higher degree of similarity; an approaching degree near zero implies that the two fuzzy sets are distinct.
Defining a new data sample B with m fuzzy attributes, the approaching degree concept can be applied to compare the new data pattern
B={B1, B2,…, Bm}
with some known data pattern Ai. Each of the known patterns Ai is characterized by the same m attributes and given by
Ai={Ai1,Ai2,…,Aim}
where i=1,2,…,k describes k-patterns.
For each of the known k patterns, the approaching degree expression is given by
(B, A_i) = Σ_{j=1}^{m} ω_j (B_j, A_ij)
where ωj is a normalizing weighting factor, taken unitary in this work.
Then sample B is closest to pattern A_j if
(B, A_j) = max_{1 ≤ i ≤ k} {(B, A_i)}
The collection of fuzzy sets
B={B1,B2,…,Bm}
can be reduced to a collection of crisp singletons, B= {x1,x2,…,xm} where each sample (xi) is a vector of features,
xi={xi1,xi2,xi3,…,xim}
and the above-mentioned weighted equation can be expressed as [12]
μ_Ai(x) = Σ_{j=1}^{m} ω_j · μ_Aij(x_j)
If it is considered that the different attributes are not of equal importance, then relative weights must be calculated.
The method consists in creating a fuzzy model for each parameter, based on the previously obtained information, and, for each new received sample, finding its degree of compatibility, or compatibility index (CI), with each class, given by the previous equation. The class for which this expression is maximum is the class to which the received information belongs. The previous concepts are applied in the following examples.
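The classification step described above can be sketched as follows; the triangular membership functions, class models, and sample values are made up for illustration, with unitary weights ω_j as in the text:

```python
# Sketch of the fuzzy classification step, with made-up triangular
# membership functions for two classes over m = 2 features.

def tri(a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# classes[name][j]: hypothetical fuzzy model of feature j for that class
classes = {
    "I":  [tri(0.2, 0.45, 0.60), tri(0.1, 0.35, 0.5)],
    "II": [tri(0.4, 0.60, 0.75), tri(0.3, 0.45, 0.6)],
}
weights = [1.0, 1.0]  # omega_j, taken unitary as in the text

def compatibility(x, mus):
    """CI: weighted sum of memberships of the crisp feature vector x."""
    return sum(w * mu(v) for w, mu, v in zip(weights, mus, x))

sample = [0.46, 0.37]  # new crisp sample
scores = {name: compatibility(sample, mus) for name, mus in classes.items()}
best = max(scores, key=scores.get)  # class with the highest CI
print(scores, best)
```

The sample is assigned to the class whose membership functions give the largest weighted sum, mirroring the maximum-CI rule of the text.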
5. Abalone database
This database [9] has been created in order to predict the age of abalone from physical measurements. Here it is used only as an example, and the obtained conclusions do not necessarily reflect all the possibilities that could be obtained in the classification.
The authors selected 150 entries from the database and divided them into three groups: Group I, from 6 to 10 years old; Group II, from 11 to 15 years old; and Group III, from 16 to 20 years old. The attributes are as follows: A- Length, B- Diameter, C- Height, D- Whole Weight, E- Shucked Weight, F- Viscera Weight, and G- Shell Weight. The elaborated information is presented in Table 1.
From Table 2, showing the maximum mutual classification error between classes in logarithmic form, it is clear that it is not possible to discriminate between groups II and III using the given attributes (Σ_k I_ijk < I_α) and the given misclassification error α = 0.2. It is possible to discriminate between groups I and III using attributes D or A. Finally, there is a possibility of discriminating between groups I and II using fuzzy logic and practically all the attributes. But in general, as per the results obtained in the example, the conclusion could be that the selected attributes are not useful for determining the abalone age using intervals of 5 years. As per Table 2, the discrimination between all the selected classes can only be done with an error of 0.9, which is in general not acceptable.
Table 1. Attributes for Each Class

Group I

Attribute | xav | σ | Min | Max
---|---|---|---|---
A | 0.46 | 0.09 | 0.27 | 0.59
B | 0.37 | 0.06 | 0.2 | 0.47
C | 0.12 | 0.02 | 0.07 | 0.18
D | 0.54 | 0.24 | 0.12 | 1.06
E | 0.22 | 0.37 | 0.05 | 0.49
F | 0.12 | 0.06 | 0.03 | 0.27
G | 0.17 | 0.07 | 0.07 | 0.34

Group II

Attribute | xav | σ | Min | Max
---|---|---|---|---
A | 0.57 | 0.06 | 0.38 | 0.72
B | 0.46 | 0.05 | 0.31 | 1.06
C | 0.16 | 0.03 | 0.10 | 0.24
D | 1.04 | 0.37 | 0.28 | 2.55
E | 0.41 | 0.15 | 0.11 | 1.07
F | 0.23 | 0.08 | 0.07 | 0.54
G | 0.32 | 0.12 | 0.10 | 0.76

Group III

Attribute | xav | σ | Min | Max
---|---|---|---|---
A | 0.62 | 0.05 | 0.54 | 0.74
B | 0.50 | 0.04 | 0.42 | 0.58
C | 0.19 | 0.02 | 0.16 | 0.24
D | 1.43 | 0.41 | 0.94 | 2.50
E | 0.53 | 0.25 | 0.35 | 0.93
F | 0.29 | 0.08 | 0.17 | 0.49
G | 0.47 | 0.15 | 0.27 | 0.78
Table 2. Mutual Classification Error between Classes Ui and Uj, Expressed in Hartley, for the Abalone

Attribute | U12 | U13 | U23 | Σ
---|---|---|---|---
A | 0.18 | ∞ | 0.05 | 0.23
B | 0.23 | 0.51 | 0 | 0.74
C | 0.14 | 0.6 | 0 | 0.74
D | 0.08 | ∞ | 0 | 0.08
E | 0.07 | 0.06 | 0 | 0.13
F | 0.07 | 0.07 | 0 | 0.14
G | 0.05 | 0.58 | 0.04 | 0.68
Σ_k I_ijk | 0.82 | ∞ | 0.09 |
Figure 1 a) and b) shows the membership functions of groups I and II for the attributes A, B, C, and D. Parameter D is presented only to make obvious that it cannot be used for discrimination.
Three values from group I have been taken at random from the abalone table [9] as examples, and their compatibility index (CI) was calculated with MATLAB using the membership functions of group I and of group II. From Table 3 it is clear that the CI of the selected examples from group I, when calculated with the group I membership functions (G1-G1), is much bigger than the CI when calculated with the group II membership functions (G1-G2).
Since the quantity of information for discriminating between groups I and II is very small, the analysis has been focused on parameters A, B, and C.
Table 3. Compatibility Index
Example | G1-G1 | G1-G2 |
---|---|---|
1 | 0.34 | 0.054 |
2 | 0.98 | 0.22 |
3 | 0.98 | 0.362 |
It is important to say that a deep analysis of the abalone classification is not made here. The intention has been to present the possibilities of the method, and the database is used only as an example.
6. Vertical handoff target selection in a heterogeneous wireless network
The information for this example has been taken basically from [10]; variations are introduced in the way of solving the problem. Wireless and mobile networking is becoming an increasingly important and popular way to provide global information access to users on the move.
The handoff process has two major stages: handoff initiation and handoff execution [2]. In the handoff initiation phase, a decision is made regarding the selection of the new Base Station (BS), or Access Point (AP), to which the Mobile Station (MS) will be transferred. In the execution phase, new radio links are formed between the BS/AP and the MS, and resources are allocated [10].
There exist several methods for solving this problem [10]. Here a simple method is presented for the selection of the best network, based on the RSS (received signal strength), the velocity, and the cost.
Table 4 shows the input parameters for the selection. There are four alternatives (target networks), A1, A2, A3, and A4, from which it is necessary to select an optimum target network for the user.
Table 4. Input Parameters
Network | A1 | A2 | A3 | A4 |
---|---|---|---|---|
RSS (dBm) | -87 | -93 | -83 | -98 |
Velocity (Mbps) | 90 | 100 | 82 | 50 |
Cost | 52 | 42 | 38 | 30 |
Three Decision Makers (DMs) with different voting powers are used: DM1 has 41% of the voting power (ζ1 = 0.41), DM2 has 34% (ζ2 = 0.34), and DM3 has 25% (ζ3 = 0.25).
The desired condition for each decision maker is shown in Table 5, where the selected fuzzy variables are: very low (VL), low (L), medium (M), high (H), and very high (VH).
The membership functions are shown in Figure 2 a) and b). As there is no criterion for the membership function selection, all of them have been taken similar, only taking into consideration the different units. Table 6 shows the compatibility index calculated for each network.
From the table it can be seen that, considering the given conditions, the best option is network A3.
Table 5. Desired condition for each attribute for each decision maker
Attributes | Decision Makers | ||
---|---|---|---|
DM1 | DM2 | DM3 | |
RSS (C1) | H | M | M |
Velocity (C2) | VH | H | H |
Cost(C3) | M | L | VL |
Table 6. Compatibility Indexes
DM | A1 | A2 | A3 | A4 |
---|---|---|---|---|
DM1 | 0.102 | 0 | 0.123 | 0 |
DM2 | 0 | 0.027 | 0 | 0.122 |
DM3 | 0.075 | 0 | 0.092 | 0 |
Σ | 0.177 | 0.027 | 0.215 | 0.122 |
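The selection logic of this example can be sketched with the CI values of Table 6; the per-DM values are taken as already computed, with the voting powers ζ_i assumed to be reflected in them:

```python
# Sketch of the handoff target selection: each decision maker's CI for
# each network is summed, and the network with the largest total wins.
# CI values are taken from Table 6 of the text.

ci = {  # ci[dm] = CI for networks A1..A4
    "DM1": [0.102, 0.0,   0.123, 0.0],
    "DM2": [0.0,   0.027, 0.0,   0.122],
    "DM3": [0.075, 0.0,   0.092, 0.0],
}

totals = [sum(ci[dm][n] for dm in ci) for n in range(4)]
best = totals.index(max(totals))  # index of the winning network
print(totals, f"A{best + 1}")     # network A3 has the largest total CI
```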
7. Conclusions
Several methods have been developed for classification with imprecise or missing information. The lack of data can be attributed to sensor failure, or simply to incomplete system information. Finally, diffuse values can be related to noise or imprecise measurements from sensors. It is not possible to affirm that one method is always better than another. In any case, selection using fuzzy logic is simple, and extremely useful when dealing with large databases. Rough sets and information theory permit minimizing the number of attributes. Both methods are based on the definition of the misclassification rates (α_ijk). For any two classes, if this index is bigger than the defined permissible misclassification rate between classes α, the discrimination between these two classes is not possible. Fuzzy logic permits obtaining results from models created using MATLAB or other specialized software.
The first example used is the abalone database; in the analysis only the dimensional attributes are considered. In general, as per the results obtained in the example, the conclusion could be that the selected attributes are not always useful for determining the abalone age using intervals of 5 years. Using fuzzy logic, the differentiation between groups I and II has been made, but it was not possible between groups II and III.
The network selected in the second example is A3. The fundamental objective of this example was to show the simplicity of the method. For obtaining trusted results, it is necessary to make a complete analysis of the membership functions used, which can be done by having a group of experts solve this task.
References
- Y. Leung et al., "A Rough Set Approach for the Discovery of Classification Rules in Interval-valued Information Systems", Int'l J. of Approximate Reasoning, No. 47, 2007, pp. 233-246.
- H. J. Parashar et al., "An Efficient Classification Approach for Data Mining", International J. of Machine Learning and Computing, Vol. 2, No. 4, August 2012.
- S. M. Gorade et al., "A Study of Some Data Mining Classification Techniques", International Research J. of Engineering and Technology (IRJET), Vol. 4, April 2017.
- A. Caballero et al., "Proposed Method and Program for Classification of Information Systems", Proceedings of the 11th WSEAS International Conference on Circuits, Systems, Electronics, Control & Signal Processing (CSECS'12), Montreux, Switzerland, December 29-31, 2012.
- K. Yen et al., "Practical Solution for the Classification in Interval-Valued Information Systems", WSEAS Transactions on Systems and Control, Issue 9, Vol. 5, September 2010, pp. 735-744.
- A. Caballero et al., "Method for Optimizing the Number and Precision of Interval-Valued Parameters in a Multi-Object System", Proceedings of the 15th International Conference on Telecommunications and Informatics (TELE-INFO'16), Ischia, Italy, June 17-19, 2016.
- T. J. Ross, Fuzzy Logic with Engineering Applications, John Wiley and Sons, Inc., 2004.
- E. D. Cox, Fuzzy Logic for Business and Industry, Rockland: Charles River Media, 1995.
- A. Asuncion and D. J. Newman (2007), UCI Machine Learning Repository [http://www.ics.uci.edu/~mlearn/MLRepository.html]. Irvine, CA: University of California, School of Information and Computer Science.
- M. Ramalingam, "Vertical Handoff Target Selection in a Heterogeneous Wireless Network Using Fuzzy Electre", M.S. Thesis in Electrical Engineering, Florida International University, 2015.
Fig. 1. Membership functions for abalone: a) Group I; b) Group II.
Fig. 2. a) Membership functions for RSS (dBm). b) Membership functions for Velocity (Mbps): 60, 65, 70, 75, 80, 85, 90, 95, 100; and for Cost ($): 20, 25, 30, 35, 40, 45, 50, 55, 60.