
A Fuzzy Associative Rule-based Approach for Pattern Mining and Pattern-based Classification

Ashish Mangalampalli
Advisor: Dr. Vikram Pudi
Centre for Data Engineering
International Institute of Information Technology (IIIT) Hyderabad

Outline

Introduction
Crisp and Fuzzy Associative Classification
Pre-Processing and Mining
  Fuzzy Pre-Processing - FPrep
  Fuzzy ARM - FAR-Miner and FAR-HD
Associative Classification - Our Approach
  FACISME - Fuzzy Adaptation of ACME (Maximum Entropy Associative Classifier)
  Simple and Effective Associative Classifier (SEAC)
  Fuzzy Simple and Effective Associative Classifier (FSEAC)
Associative Classification - Applications
  Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)
  Associative Classifier for Ad-targeting
Conclusions

Introduction

Associative classification
  Mines huge amounts of data
  Integrates Association Rule Mining (ARM) with Classification
    A = a, B = b, C = c  =>  X = x  (see the sketch after this slide)
Associative classifiers have several advantages
  Frequent itemsets capture dominant relationships between items/features
  Statistically significant associations make the classification framework robust
  Low-frequency patterns (noise) are eliminated during ARM
  Rules are very transparent and easily understood
    Unlike the black-box approach of popular classifiers such as SVMs and Artificial Neural Networks
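As a minimal sketch, a classification association rule such as A = a, B = b, C = c => X = x can be represented as an attribute-value itemset plus a class label. The encoding below is an illustrative assumption, not the thesis's actual data structure:

```python
# Minimal sketch of a classification association rule (CAR), assuming
# a simple attribute-value encoding; not the thesis's actual structures.
from dataclasses import dataclass

@dataclass(frozen=True)
class CAR:
    antecedent: frozenset      # e.g. {("A", "a"), ("B", "b"), ("C", "c")}
    label: str                 # consequent class, e.g. "x"
    support: float             # fraction of records containing the antecedent
    confidence: float          # P(label | antecedent)

    def matches(self, record: dict) -> bool:
        return all(record.get(attr) == val for attr, val in self.antecedent)

rule = CAR(frozenset({("A", "a"), ("B", "b"), ("C", "c")}), "x", 0.12, 0.87)
print(rule.matches({"A": "a", "B": "b", "C": "c", "D": "d"}))   # True
```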

Outline

Introduction
Crisp and Fuzzy Associative Classification
Pre-Processing and Mining
  Fuzzy Pre-Processing - FPrep
  Fuzzy ARM - FAR-Miner and FAR-HD
Associative Classification - Our Approach
  Simple and Effective Associative Classifier (SEAC)
  Fuzzy Simple and Effective Associative Classifier (FSEAC)
Associative Classification - Applications
  Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)
  Associative Classifier for Ad-targeting
Conclusions

Crisp Associative Classification

Most associative classifiers are crisp
  Most real-life datasets contain binary and numerical attributes
  Use sharp partitioning
  Transform numerical attributes into binary ones, e.g. Income = [100K and above]
Drawbacks of sharp partitioning
  Introduces uncertainty, especially at partition boundaries
  Small changes in intervals lead to misleading results
  Gives rise to polysemy and synonymy
  Intervals generally do not have clear semantics associated with them
For example, sharp partitions for the attribute Income
  Up to 20K, 20K-100K, 100K and above
  Income = 50K would fit in the second partition
  But, so would Income = 99K

Fuzzy Associative Classification

Fuzzy logic
  Used to convert numerical attributes into fuzzy attributes (e.g. Income = High)
  Maintains the integrity of the information conveyed by numerical attributes
  Attribute values belong to partitions with some membership degree in the interval [0, 1] (sketched below)
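A minimal sketch of fuzzifying a numerical attribute such as Income with triangular membership functions; the partition names and breakpoints are illustrative assumptions, not parameters from the deck:

```python
# Minimal sketch: fuzzify a numerical attribute with triangular
# membership functions. Partition names and breakpoints are
# illustrative assumptions, not the deck's actual parameters.

def triangular(x, a, b, c):
    """Membership of x in a triangle peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical partitions for Income (in thousands).
PARTITIONS = {
    "Low":    (0, 0, 40),      # full membership at 0, fades out by 40K
    "Medium": (20, 60, 100),   # peaks at 60K
    "High":   (80, 120, 1e9),  # ramps up from 80K onward
}

def fuzzify_income(income):
    return {name: round(triangular(income, a, b, c), 3)
            for name, (a, b, c) in PARTITIONS.items()}

# Income = 99K is no longer forced into one sharp interval:
# it belongs mostly to High and slightly to Medium.
print(fuzzify_income(99))   # {'Low': 0.0, 'Medium': 0.025, 'High': 0.475}
```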

Outline

Introduction
Crisp and Fuzzy Associative Classification
Pre-Processing and Mining
  Fuzzy Pre-Processing - FPrep
  Fuzzy ARM - FAR-Miner and FAR-HD
Associative Classification - Our Approach
  Simple and Effective Associative Classifier (SEAC)
  Fuzzy Simple and Effective Associative Classifier (FSEAC)
Associative Classification - Applications
  Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)
  Associative Classifier for Ad-targeting
Conclusions

Pre-Processing and Mining

Fuzzy pre-processing
  Converts a crisp dataset (binary and numerical attributes) into a fuzzy dataset (binary and fuzzy attributes)
  The FPrep algorithm is used (sketched after this list)
Efficient and robust fuzzy ARM algorithms
  Web-scale datasets mandate such algorithms
  Fuzzy Apriori is the most popular
  Many efficient crisp ARM algorithms exist, like ARMOR and FP-Growth
  Algorithms used
    FAR-Miner for normal transactional datasets
    FAR-HD for high-dimensional datasets
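The references describe FPrep as fuzzy-clustering driven; under that assumption, a toy 1-D fuzzy c-means can stand in for the real algorithm to show how a numerical column becomes fuzzy attributes with membership degrees. The real FPrep is more involved than this sketch:

```python
# Sketch of FPrep-style pre-processing under one big assumption:
# FPrep is fuzzy-clustering driven, so a tiny 1-D fuzzy c-means
# turns a numerical column into fuzzy attribute columns here.
import numpy as np

def fcm_1d(values, c=3, m=2.0, iters=50):
    """Fuzzy c-means on a 1-D array: returns (centers, memberships)."""
    np.random.seed(0)
    x = np.asarray(values, dtype=float)
    u = np.random.dirichlet(np.ones(c), size=len(x))   # random fuzzy partition
    for _ in range(iters):
        um = u ** m
        centers = (um * x[:, None]).sum(axis=0) / um.sum(axis=0)
        d = np.abs(x[:, None] - centers[None, :]) + 1e-9
        inv = d ** (-2.0 / (m - 1))
        u = inv / inv.sum(axis=1, keepdims=True)
    return centers, u

incomes = [12, 25, 48, 55, 90, 110, 130]              # toy numerical attribute
centers, u = fcm_1d(incomes, c=3)
order = np.argsort(centers)                            # label clusters Low/Medium/High
for i, inc in enumerate(incomes):
    low, med, high = u[i, order[0]], u[i, order[1]], u[i, order[2]]
    print(f"Income={inc:>3}: Low={low:.2f} Medium={med:.2f} High={high:.2f}")
```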

Outline

Introduction
Crisp and Fuzzy Associative Classification
Pre-Processing and Mining
  Fuzzy Pre-Processing - FPrep
  Fuzzy ARM - FAR-Miner and FAR-HD
Associative Classification - Our Approach
  Simple and Effective Associative Classifier (SEAC)
  Fuzzy Simple and Effective Associative Classifier (FSEAC)
Associative Classification - Applications
  Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)
  Associative Classifier for Ad-targeting
Conclusions

Associative Classification - Our Approach

AC algorithms like CPAR and CMAR only mine frequent itemsets
  Itemsets are processed using additional (greedy) algorithms like FOIL and PRM
  Overhead in running time; the process is more complex
Association rules are directly used for training and scoring
  Exhaustive approach
    Controlled by an appropriate support threshold
    Not a time-intensive process
  Rule pruning and ranking take care of the huge volume and redundancy
Classifier built in a two-phased manner
  Global rule-mining and training
  Local rule-mining and training
  Provides better accuracy and representation/coverage

Associative Classification - Our Approach (contd.)

Pre-processing to generate a fuzzy dataset (for fuzzy associative classifiers) using FPrep
Classification Association Rules (CARs) mined using FAR-Miner or FAR-HD
CARs pruned and the classifier trained using SEAC or FSEAC
Rule ranking and rule application (scoring) techniques complete the pipeline (a toy end-to-end sketch follows)
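A runnable toy sketch of the four-step pipeline; each stub stands in for the named component (FPrep, FAR-Miner/FAR-HD, SEAC/FSEAC) and is drastically simplified:

```python
# Runnable toy sketch of the pipeline; every stub stands in for the
# real component named on the slide and is drastically simplified.
from collections import Counter
from itertools import combinations

def fprep_stub(rows):
    # Real FPrep fuzzifies numerical attributes; these toy rows are
    # already categorical, so it is a no-op here.
    return rows

def mine_cars_stub(rows, labels, min_support=2):
    # Real mining is FAR-Miner / FAR-HD; here: frequent 1- and
    # 2-itemsets, each kept per class label.
    counts = Counter()
    for feats, y in zip(rows, labels):
        for r in (1, 2):
            for itemset in combinations(sorted(feats), r):
                counts[(itemset, y)] += 1
    return [(set(i), y) for (i, y), n in counts.items() if n >= min_support]

def train(rows, labels):
    rows = fprep_stub(rows)                  # 1. fuzzy pre-processing
    cars = mine_cars_stub(rows, labels)      # 2. mine CARs
    return cars                              # 3-4. SEAC/FSEAC would prune,
                                             #      rank, and score these

cars = train([{"A=a", "B=b"}, {"A=a", "C=c"}, {"B=b", "C=c"}], ["x", "x", "y"])
print(cars)   # [({'A=a'}, 'x')] with this toy data
```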

Simple and Effective Associative Classifier (SEAC)

Direct mining of CARs - faster and simpler training
CARs used directly, through effective pruning and sorting
Pruning and rule-ranking based on (sketched after this list)
  Information gain
  Rule length
Two-phased manner
  Global rule-mining and training
  Local rule-mining and training
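A minimal sketch of the pruning/ranking step, assuming rules are kept above an information-gain threshold and sorted by IG (descending), then rule length (ascending, shorter = more general); the threshold value is illustrative:

```python
# Minimal sketch of SEAC-style rule ranking: information gain first
# (descending), then rule length (ascending). The IG threshold is an
# illustrative assumption.
import math

def information_gain(n_pos, n_neg, n_pos_match, n_neg_match):
    """IG of splitting the training set on whether a rule matches."""
    def entropy(p, n):
        t = p + n
        if t == 0 or p == 0 or n == 0:
            return 0.0
        return -(p/t)*math.log2(p/t) - (n/t)*math.log2(n/t)
    total = n_pos + n_neg
    match = n_pos_match + n_neg_match
    rest = total - match
    h_before = entropy(n_pos, n_neg)
    h_after = (match/total)*entropy(n_pos_match, n_neg_match) \
            + (rest/total)*entropy(n_pos - n_pos_match, n_neg - n_neg_match)
    return h_before - h_after

def prune_and_rank(rules, min_ig=0.01):
    # rules: list of (antecedent_itemset, label, ig)
    kept = [r for r in rules if r[2] >= min_ig]
    return sorted(kept, key=lambda r: (-r[2], len(r[0])))
```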

SEAC - Example

[Example Dataset and Ruleset shown as tables in the original slide]

Scoring example for the unlabeled record B = 2, C = 2 (computation sketched below)
  X = 1: rules 16, 17, 19 match (IG-based score = 0.534)
  X = 2: rules 13, 14, 20 match (IG-based score = 0.657)
  X = 2 scores higher, so the record is assigned class X = 2
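Read as code, the scoring decision reduces to an argmax over per-class scores; the assumption that each class's matched rules yield one aggregated IG-based score follows the slide's numbers:

```python
# Scoring sketch for the slide's example: the unlabeled record B=2, C=2
# matches rules {16, 17, 19} for X=1 and {13, 14, 20} for X=2.
# Assumption: each class's matched rules yield one IG-based score and
# the highest-scoring class is predicted.
class_scores = {"X=1": 0.534, "X=2": 0.657}   # IG-based scores from the slide
prediction = max(class_scores, key=class_scores.get)
print(prediction)   # X=2
```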

Fuzzy Simple and Effective Associative Classifier (FSEAC)

Amalgamates fuzzy logic with associative classification
  Dataset pre-processed using FPrep
  CARs mined using FAR-Miner / FAR-HD
CARs pruned based on Fuzzy Information Gain (FIG) and rule length - no sorting required (a FIG sketch follows)
Scoring rules applied taking FIG into account
  Sorting done, then the final score computed
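The slide does not spell out the FIG formula; a common construction, assumed in this sketch, replaces crisp record counts in the information-gain computation with sums of fuzzy membership degrees:

```python
# Hedged sketch of a fuzzy information gain: crisp counts are replaced
# by sums of membership degrees. This is a standard construction, not
# necessarily FSEAC's exact FIG formula.
import math

def fuzzy_entropy(weights_per_class):
    total = sum(weights_per_class)
    if total == 0:
        return 0.0
    return -sum((w/total) * math.log2(w/total)
                for w in weights_per_class if w > 0)

def fuzzy_information_gain(memberships, labels, classes):
    """memberships[i]: degree to which record i matches the rule (0..1)."""
    total_w = [sum(1.0 for y in labels if y == c) for c in classes]
    match_w = [sum(m for m, y in zip(memberships, labels) if y == c)
               for c in classes]
    rest_w = [t - mw for t, mw in zip(total_w, match_w)]
    n = len(labels)
    h_before = fuzzy_entropy(total_w)
    h_after = (sum(match_w)/n) * fuzzy_entropy(match_w) \
            + (sum(rest_w)/n) * fuzzy_entropy(rest_w)
    return h_before - h_after

print(fuzzy_information_gain([0.9, 0.8, 0.1, 0.0],
                             ["x", "x", "y", "y"], ["x", "y"]))
```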

FSEAC - Example

[Tables in the original slides: Format for Fuzzy Version of Dataset, Example Dataset, Fuzzy Version of Example Dataset, and Ruleset]

SEAC and FSEAC - Experimental Setup

SEAC
  12 classifiers (associative and non-associative)
  14 UCI ML datasets
  100-5000 records per dataset
  2-10 classes per dataset
  Up to 20 features per dataset
  10-fold cross-validation
FSEAC
  17 classifiers (associative and non-associative; fuzzy and crisp)
  23 UCI ML datasets
  100-5000 records per dataset
  2-10 classes per dataset
  Up to 60 features per dataset
  10-fold cross-validation

SEAC - Results (10-fold CV)

[Results tables shown in the original slides]

FSEAC - Results (10-fold CV)

[Results tables shown in the original slides]

Outline

Introduction
Crisp and Fuzzy Associative Classification
Pre-Processing and Mining
  Fuzzy Pre-Processing - FPrep
  Fuzzy ARM - FAR-Miner and FAR-HD
Associative Classification - Our Approach
  Simple and Effective Associative Classifier (SEAC)
  Fuzzy Simple and Effective Associative Classifier (FSEAC)
Associative Classification - Applications
  Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)
  Associative Classifier for Ad-targeting
Conclusions

Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)

Adapts fuzzy associative classification for object class detection in images
  Speeded-Up Robust Features (SURF) - interest point detector and descriptor for images
  Fuzzy clusters used, as opposed to the hard clustering used in Bag-of-Words
Only positive class (CP) examples used for mining
  The negative class (CN) in object class detection is very vague
    CN = U - CP (everything outside the positive class)
Rules are pruned and ranked based on Information Gain
  Other AC algorithms use third-party algorithms for rule generation from frequent itemsets
  Top k rules are used for scoring and classification

ICPR 2010

I-FAC

SURF points extracted from positive-class images
  FCM applied to derive clusters (a front-end sketch follows this slide)
  Clusters (with membership values) used to generate the dataset for mining
    100 fuzzy clusters, as opposed to the 1000-2000 clusters used by crisp-clustering-based algorithms
ARM generates Classification Association Rules (CARs) associated with the positive class
CARs are pruned and sorted using
  Fuzzy Information Gain (FIG) of each rule
  Length of each rule, i.e. the number of attributes in the rule
Scoring based on rule-match and FIG

ICPR 2010
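A hedged sketch of the I-FAC front end: SURF descriptors from positive-class images are fuzzy-clustered into fuzzy "visual words". It assumes opencv-contrib-python (SURF lives in the xfeatures2d contrib module and may be disabled in some builds) and scikit-fuzzy; the per-image max aggregation is an illustrative choice, not necessarily I-FAC's exact rule:

```python
# Hedged sketch of the I-FAC front end. Assumptions (not from the deck):
# opencv-contrib-python for SURF, scikit-fuzzy for FCM, and max-pooling
# of descriptor memberships per image.
import cv2
import numpy as np
import skfuzzy as fuzz

def surf_descriptors(image_paths, hessian_threshold=400):
    """Extract SURF descriptors from each positive-class image."""
    surf = cv2.xfeatures2d.SURF_create(hessian_threshold)
    per_image = []
    for path in image_paths:
        img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        _, desc = surf.detectAndCompute(img, None)
        if desc is not None:
            per_image.append(desc)
    return per_image

def fuzzy_visual_words(per_image, n_clusters=100, m=2.0):
    """FCM over all descriptors -> 100 fuzzy clusters (slide's choice)."""
    all_desc = np.vstack(per_image)
    centers, _, *_ = fuzz.cluster.cmeans(
        all_desc.T, c=n_clusters, m=m, error=1e-4, maxiter=100)
    return centers

def image_to_fuzzy_transaction(desc, centers, m=2.0):
    """One fuzzy value per cluster for a single image's descriptors.
    Max over descriptors is an assumption, not I-FAC's exact rule."""
    u, *_ = fuzz.cluster.cmeans_predict(
        desc.T, centers, m=m, error=1e-4, maxiter=50)
    return u.max(axis=1)
```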

I-FAC - Performance Study

Performs well when compared to BoW- and SVM-based approaches
  Performs very well at low FPRs (up to 0.3)
Fuzzy nature helps avoid polysemy and synonymy
Uses only the positive class for training

ICPR 2010

Visual Concept Detection on MIR Flickr

Revamped version of I-FAC
  Multi-class detection
  38 visual concepts, e.g. car, sky, clouds, water, building, sea, face
Experimental evaluation
  First 10K images of the MIR Flickr dataset
  AUC values reported for each concept

Experimental Results (3-fold CV)

[Results tables shown in the original slides]

Look-alike Modeling using Feature-Pair-based Associative Classification

Display-ad targeting is currently done using methods which rely on publisher-defined segments, like Behavior-Targeting (BT)
A look-alike model is trained to identify similar users
  Similarity is based on historical user behavior
  The model is iteratively rebuilt as more users are added
  The advertiser supplies a seed list of users
Approach for building advertiser-specific audience segments
  Complements publisher-defined segments such as BT
  Provides advertisers control over the audience definition
Given a list of target users (e.g., people who clicked or converted on a particular category or ad campaign), find other similar users

WWW 2011

Look-alike Modeling using Feature-Pair-based Associative Classification (contd.)

Enumerate all feature-pairs in the training set occurring in at least 5 positive-class records
  Feature-pairs modelled as AC rules
  Only rules for the positive class are used
  Works well in tail campaigns
Affinity measured by Frequency-weighted Log-Likelihood Ratio (F-LLR); a sketch follows this slide
  F-LLR(f) = P(f) * log( P(f | conv) / P(f | non-conv) )
  Rules sorted in descending order of F-LLR
Scoring - the top k rules are applied
  Cumulative score from all matching rules used for classification

WWW 2011
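A sketch of feature-pair enumeration and F-LLR ranking as described on the slide; the record layout and the smoothing constant are assumptions, while the min-support of 5 positive-class records and the F-LLR formula follow the slide:

```python
# Sketch of feature-pair enumeration and F-LLR scoring per the slide:
# F-LLR(f) = P(f) * log(P(f|conv) / P(f|non-conv)). The data layout
# and smoothing are assumptions; the >=5 positives rule is the slide's.
import math
from collections import Counter
from itertools import combinations

def feature_pair_fllr(records, labels, min_pos_count=5, eps=1e-9):
    """records: list of feature sets; labels: 1 = converted, 0 = not."""
    pos = sum(labels)
    neg = len(labels) - pos
    pair_total, pair_pos = Counter(), Counter()
    for feats, y in zip(records, labels):
        for pair in combinations(sorted(feats), 2):
            pair_total[pair] += 1
            pair_pos[pair] += y
    scores = {}
    for pair, n in pair_total.items():
        n_pos, n_neg = pair_pos[pair], n - pair_pos[pair]
        if n_pos < min_pos_count:        # slide: keep pairs in >= 5 positives
            continue
        p_f = n / len(labels)
        p_conv = n_pos / pos
        p_nonconv = n_neg / neg + eps    # smoothed to avoid log(0)
        scores[pair] = p_f * math.log(p_conv / p_nonconv)
    # rules (feature-pairs) in descending order of F-LLR
    return sorted(scores.items(), key=lambda kv: -kv[1])

def score_user(user_feats, ranked_rules, k=50):
    """Cumulative F-LLR of the top-k rules the user matches."""
    feats = set(user_feats)
    return sum(s for (a, b), s in ranked_rules[:k]
               if a in feats and b in feats)
```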

Performance Study

Two pilot campaigns
  300K records each; one record per user
  Training window: 14 days; scoring window: seven days

Results on a Tail Campaign

  Baseline            Lift (Conversion Rate)   Lift (AUC)
  Random Targeting    82%                      -
  Linear SVM          301%                     11%
  GBDT                100%                     2%

Works very well for tail campaigns
  Can find meaningful associations in extremely sparse and skewed data

Results on a Head Campaign

  Baseline            Lift (Conversion Rate)   Lift (AUC)
  Random Targeting    48%                      -
  Linear SVM          -12%                     -6%
  GBDT                -40%                     -14%

SVM and GBDT work well for head campaigns

WWW 2011

Outline

Introduction
Crisp and Fuzzy Associative Classification
Pre-Processing and Mining
  Fuzzy Pre-Processing - FPrep
  Fuzzy ARM - FAR-Miner and FAR-HD
Associative Classification - Our Approach
  Simple and Effective Associative Classifier (SEAC)
  Fuzzy Simple and Effective Associative Classifier (FSEAC)
Associative Classification - Applications
  Efficient Fuzzy Associative Classifier for Object Classes in Images (I-FAC)
  Associative Classifier for Ad-targeting
Conclusions

Conclusions

Fuzzy pre-processing for dataset transformation
Fuzzy ARM for various types of datasets
Fuzzy and crisp associative classifiers for various domains
Customizations required for different domains
  Pre-processing
  Pruning
  Rule-ranking techniques
  Rule application (scoring) techniques

References

Ashish Mangalampalli, Adwait Ratnaparkhi, Andrew O. Hatch, Abraham Bagherjeiran, Rajesh Parekh, and Vikram Pudi. A Feature-Pair-based Associative Classification Approach to Look-alike Modeling for Conversion-Oriented User-Targeting in Tail Campaigns. In International World Wide Web Conference (WWW), 2011.

Ashish Mangalampalli, Vineet Chaoji, and Subhajit Sanyal. I-FAC: Efficient Fuzzy Associative Classifier for Object Classes in Images. In International Conference on Pattern Recognition (ICPR), 2010.

Ashish Mangalampalli and Vikram Pudi. FPrep: Fuzzy Clustering Driven Efficient Automated Pre-processing for Fuzzy Association Rule Mining. In IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), 2010.

Ashish Mangalampalli and Vikram Pudi. FACISME: Fuzzy Associative Classification Using Iterative Scaling and Maximum Entropy. In IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), 2010.

Ashish Mangalampalli and Vikram Pudi. Fuzzy Association Rule Mining Algorithm for Fast and Efficient Performance on Very Large Datasets. In IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), 2009.

Thank You, and Questions

