What's the difference between multi label classification and fuzzy classification?
Is it just a difference in terminology between academics and practitioners?
Or is there a theoretical difference in how we consider each sample: as belonging to multiple classes at once, or to one fuzzy class?
Or does this distinction have some practical consequence for how we build the classification model?
classification multilabel-classification fuzzy-logic fuzzy-classification
asked 12 hours ago by DmytroSytro
2 Answers
Multi-label classification (Wiki):
Given $K$ classes, find a map $f: X \rightarrow \{0, 1\}^K$.
Fuzzy classification (a good citation is needed!):
Given $K$ classes, find a map $p: X \rightarrow [0, 1]^K$ where $\sum_{k=1}^{K} p(k) = 1$.
In multi-label classification, as defined, there is no "resource limit" on the classes, in contrast to fuzzy classification.
For example, a neural network with a softmax layer performs fuzzy classification (soft classification). If we select only the class with the highest score, it becomes single-label classification (hard classification), and if we select the top $k$ classes, it becomes multi-label classification (again hard classification).
Fuzzy classification: [0.5, 0.2, 0.3, 0, 0]
Single-label classification: [1, 0, 0, 0, 0]
Multi-label classification: [1, 0, 1, 0, 0]
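As a minimal NumPy sketch (purely illustrative, not from the original answer), here is how a single softmax output can be turned into each of the three decisions above; the `scores` vector mirrors the example and `k = 2` is an assumed cutoff:

```python
import numpy as np

# Hypothetical softmax output for one sample over 5 classes (sums to 1).
scores = np.array([0.5, 0.2, 0.3, 0.0, 0.0])

# Fuzzy (soft) classification: keep the scores themselves as degrees of membership.
fuzzy = scores

# Single-label (hard) classification: pick only the highest-scoring class.
single = np.zeros_like(scores)
single[np.argmax(scores)] = 1            # -> [1, 0, 0, 0, 0]

# Multi-label (hard) classification: keep the top-k classes, here k = 2.
k = 2
multi = np.zeros_like(scores)
multi[np.argsort(scores)[-k:]] = 1       # -> [1, 0, 1, 0, 0]
```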
As another example of multi-label classification, we could have $K$ neural networks for the $K$ classes, each with a sigmoid output, and assign a point to class $k$ if the output of network $k$ is higher than 0.5.
Outputs: [0.6, 0.1, 0.6, 0.9, 0.2]
Multi-label classification: [1, 0, 1, 1, 0]
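A correspondingly small sketch of this per-class sigmoid setup (again illustrative, with made-up outputs): thresholding each independent output at 0.5 gives the multi-label decision.

```python
import numpy as np

# Hypothetical per-class sigmoid outputs; they are independent and need not sum to 1.
outputs = np.array([0.6, 0.1, 0.6, 0.9, 0.2])

# Multi-label decision: assign every class whose output exceeds the 0.5 threshold.
labels = (outputs > 0.5).astype(int)     # -> [1, 0, 1, 1, 0]
```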
Practical considerations
As demonstrated in the examples, the key difference is the "resource limit" that exists in fuzzy classification but not in multi-label classification. Including the limit (in the first example), or ignoring it (in the second example) depends on the task. For example, in a classification task that has mutually exclusive labels, we want to include the "resource limit" to impose the "mutually exclusive" assumption on the model.
Note that the $\sum_{k=1}^{K} p(k) = 1$ restriction in fuzzy classification is merely a "definition"; there is no point in arguing about a definition. We can either propose another classification, or argue about when to use, and when not to use, such a classification.
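To make the "resource limit" concrete, here is a small sketch (my own illustration, not part of the original answer) contrasting softmax and independent sigmoids applied to the same hypothetical raw scores:

```python
import numpy as np

logits = np.array([2.0, -1.0, 0.5, 0.1, -2.0])   # hypothetical raw network scores

# Softmax imposes the "resource limit": the degrees compete and always sum to 1.
softmax = np.exp(logits) / np.exp(logits).sum()

# Independent sigmoids do not: every class could approach 1 at the same time.
sigmoids = 1.0 / (1.0 + np.exp(-logits))

print(softmax.sum())    # 1.0
print(sigmoids.sum())   # generally not 1.0
```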
answered 11 hours ago by Esmailian (edited 9 hours ago)

Hmm, but I thought the point of multi-label classification is not to use softmax, because the classes don't exclude each other.
– DmytroSytro
11 hours ago
@DmytroSytro You are right, I added another example.
– Esmailian
10 hours ago
So, does it really matter for a fuzzy set that the sum of probabilities over all classes equals 1?
– DmytroSytro
10 hours ago
@DmytroSytro I've added a section to explain when the sum = 1 restriction is useful.
– Esmailian
9 hours ago
Thank you! Still, it confuses me a bit that in fuzzy sets there should be a limit on the sum of probabilities, so that a sample can't belong to different sets with probability 1 for each set. As I understand it, this is a result of the truth function in fuzzy logic, which can't assign a sum of probabilities greater than 1.
– DmytroSytro
9 hours ago
A multi-label classifier learns to predict class labels from training data using some algorithm. It learns to associate an object's labels with a vector of feature values, estimating the probability that a sample belongs to a certain class based on some condition.
Fuzzy classifiers do exactly the same thing, except that they use fuzzy logic to determine which class a sample belongs to. The data need to be described using linguistic rules, as opposed to the data used by a conventional classifier, and classifying a sample returns a "degree of membership" for each class.
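As a rough illustration of "linguistic rules" and "degree of membership" (my own sketch with made-up membership functions and class names, not taken from the answer):

```python
def triangular(x, a, b, c):
    """Triangular membership function: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical linguistic classes for a temperature reading, in degrees Celsius.
classes = {
    "cold": (-10, 0, 15),
    "warm": (10, 20, 30),
    "hot": (25, 35, 45),
}

reading = 18.0
memberships = {name: triangular(reading, *abc) for name, abc in classes.items()}
# Each value is a degree of membership in [0, 1]; the values need not sum to 1.
print(memberships)   # e.g. {'cold': 0.0, 'warm': 0.8, 'hot': 0.0}
```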
answered 11 hours ago by Sterls

So, fuzzy classifiers aren't machine learning?
– DmytroSytro
11 hours ago
They definitely are. Both are shaped by the mathematical model used to describe the problem. The design of the model differs between the fuzzy and the conventional case, but it all stems from the mathematical model nonetheless. The "degree of membership" mentioned at the end can be thought of as synonymous with "the probability of a sample being associated with a certain label".
– Sterls
11 hours ago