Mirror of https://github.com/tutao/tutanota.git, synced 2025-12-08 06:09:50 +00:00
improve inbox rule handling and run spam prediction after inbox rules
Instead of applying inbox rules based on the unread mail state in the inbox folder, we introduce the new ProcessingState enum on the mail type. Once a mail has been processed by the leader client, which checks for matching inbox rules, its ProcessingState is updated. If there is a matching rule, the flag is updated through the MoveMailService; if there is no matching rule, the flag is updated using the ClientClassifierResultService. Both requests are throttled/debounced. After the inbox rules have been processed, spam prediction is run for mails that were not moved by an inbox rule. The ProcessingState of these non-matching ham mails is also updated using the ClientClassifierResultService.

This new inbox rule handling solves the following two problems:
- when clicking on a notification, it could still happen that the inbox rules were not applied
- when the inbox folder contained many unread mails, loading time increased massively, since inbox rules were re-applied on every load

Co-authored-by: amm <amm@tutao.de>
Co-authored-by: Nick <nif@tutao.de>
Co-authored-by: das <das@tutao.de>
Co-authored-by: abp <abp@tutao.de>
Co-authored-by: jhm <17314077+jomapp@users.noreply.github.com>
Co-authored-by: map <mpfau@users.noreply.github.com>
Co-authored-by: Kinan <104761667+kibibytium@users.noreply.github.com>
This commit is contained in:
parent
030bea4fe6
commit
f11e59672e
53 changed files with 1269 additions and 1010 deletions
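For orientation, here is a minimal TypeScript sketch of the flow the commit message describes: inbox rules run first on the leader client, spam prediction runs afterwards only for mails that no rule moved, and the ProcessingState flag is updated through the corresponding service request. The enum members, types, and helper functions below are illustrative assumptions rather than the actual tutanota code; in the real client the MoveMailService and ClientClassifierResultService requests are additionally throttled/debounced.

// Hedged sketch of the flow from the commit message above.
// ProcessingState member names, types, and helpers are assumptions,
// not the actual tutanota implementation.

enum ProcessingState {
	NotProcessed = "0",
	InboxRuleApplied = "1",
	ClassifiedAsNotSpam = "2",
}

interface Mail {
	id: string
	subject: string
	processingState: ProcessingState
}

interface InboxRule {
	targetFolderId: string
	matches(mail: Mail): boolean
}

// Stub services; in the real client these would be MoveMailService and
// ClientClassifierResultService requests, throttled/debounced before sending.
async function moveMail(mail: Mail, targetFolderId: string): Promise<void> {
	mail.processingState = ProcessingState.InboxRuleApplied
}

async function reportClassifierResult(mail: Mail): Promise<void> {
	mail.processingState = ProcessingState.ClassifiedAsNotSpam
}

async function predictSpam(mail: Mail): Promise<boolean> {
	return false // placeholder for the client-side spam classifier
}

// Runs on the leader client only, so each mail is processed exactly once
// instead of re-running inbox rules on every load of the inbox folder.
export async function processNewInboxMails(mails: Mail[], rules: InboxRule[]): Promise<void> {
	const unprocessed = mails.filter((m) => m.processingState === ProcessingState.NotProcessed)

	// 1. Apply inbox rules first.
	const notMovedByRule: Mail[] = []
	for (const mail of unprocessed) {
		const rule = rules.find((r) => r.matches(mail))
		if (rule) {
			await moveMail(mail, rule.targetFolderId) // ProcessingState is updated by the move
		} else {
			notMovedByRule.push(mail)
		}
	}

	// 2. Run spam prediction only for mails that no rule moved; report ham
	//    results so their ProcessingState is updated as well.
	for (const mail of notMovedByRule) {
		if (!(await predictSpam(mail))) {
			await reportClassifierResult(mail)
		}
	}
}

Keeping the processing state on the mail itself is what avoids re-applying rules on every load of an inbox with many unread mails, and it lets a notification click rely on rules having already been applied.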
Diff excerpt (HashingVectorizer test):

@@ -1,12 +1,7 @@
 import o from "@tutao/otest"
 import { HashingVectorizer } from "../../../../../../src/mail-app/workerUtils/spamClassification/HashingVectorizer"
 import { arrayEquals } from "@tutao/tutanota-utils"
-
-export const tokenize = (text: string): string[] =>
-	text
-		.toLowerCase()
-		.split(/\s+/)
-		.filter((t) => t.length > 1)
+import { spamClassifierTokenizer } from "../../../../../../src/mail-app/workerUtils/spamClassification/SpamClassifier"
 
 o.spec("HashingVectorizer", () => {
 	const rawDocuments = [
@@ -17,7 +12,7 @@ o.spec("HashingVectorizer", () => {
 		"Millions of people choose Tuta to protect their personal and professional communication.",
 	]
 
-	const tokenizedDocuments = rawDocuments.map(tokenize)
+	const tokenizedDocuments = rawDocuments.map(spamClassifierTokenizer)
 
 	o("vectorize creates same vector for same tokens", async () => {
 		const vectorizer = new HashingVectorizer()
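The diff above swaps the test's local tokenize helper for the shared spamClassifierTokenizer, so the test tokenizes exactly like the classifier does. As a hedged sketch of the technique the test exercises, the TypeScript below pairs such a tokenizer with a hashing-trick vectorizer; the bucket count, hash function, and class/method names are assumptions, and the repo's actual HashingVectorizer may differ.

// Hedged sketch of hashing-trick vectorization; not the repo's implementation.

// Equivalent of the removed local helper: lowercase, split on whitespace,
// drop single-character tokens.
export const tokenize = (text: string): string[] =>
	text
		.toLowerCase()
		.split(/\s+/)
		.filter((t) => t.length > 1)

// Simple FNV-1a string hash used to map tokens to buckets.
function fnv1a(token: string): number {
	let hash = 0x811c9dc5
	for (let i = 0; i < token.length; i++) {
		hash ^= token.charCodeAt(i)
		hash = Math.imul(hash, 0x01000193) >>> 0
	}
	return hash
}

export class SimpleHashingVectorizer {
	constructor(private readonly dimension: number = 1024) {}

	// The mapping from token to bucket is a pure function of the token,
	// so the same tokens always produce the same vector.
	vectorize(tokens: string[]): Float32Array {
		const vector = new Float32Array(this.dimension)
		for (const token of tokens) {
			vector[fnv1a(token) % this.dimension] += 1
		}
		return vector
	}
}

// Usage mirroring the test setup:
// const vectorizer = new SimpleHashingVectorizer()
// const v1 = vectorizer.vectorize(tokenize("Millions of people choose Tuta"))
// const v2 = vectorizer.vectorize(tokenize("Millions of people choose Tuta"))
// v1 and v2 are element-wise equal.

Because vectorization is deterministic for a given token list, vectorizing the same input twice yields identical vectors, which is what the "vectorize creates same vector for same tokens" test asserts.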