FSTs and finite state machines

Today I read an article about Lucene's use of finite state transducers (FSTs): http://www.cnblogs.com/LBSer/p/4119841.html At first I thought an FST was very similar to a trie, but then I found a difference:
a trie is a tree structure in which each child node has exactly one parent, while an FST is a graph structure in which a node can have more than one parent (equivalent suffixes are merged), so an FST uses less space.
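To make the difference concrete, here is a toy sketch of my own (an illustration, not Lucene's actual FST code): a trie stores shared prefixes once, but words like "stop" and "top" still duplicate the "op" suffix, which is exactly the redundancy an FST removes by merging equivalent suffix states.

```java
import java.util.HashMap;
import java.util.Map;

// Illustration only: count how many nodes a trie needs for a word list.
// A trie shares prefixes but duplicates suffixes; an FST would merge them.
public class TrieNodeCount {

    // One trie node: a map from the next character to the child node.
    static class Node {
        final Map<Character, Node> children = new HashMap<>();
    }

    // Insert each word into the trie and count how many nodes get created.
    static int countTrieNodes(String[] words) {
        Node root = new Node();
        int nodes = 0;
        for (String word : words) {
            Node now = root;
            for (char c : word.toCharArray()) {
                Node next = now.children.get(c);
                if (next == null) {
                    next = new Node();
                    now.children.put(c, next);
                    nodes++; // a brand-new node for this character
                }
                now = next;
            }
        }
        return nodes;
    }

    public static void main(String[] args) {
        // "stop" needs 4 nodes, and "top" shares no prefix with it, so it
        // adds 3 more: 7 in total. An FST could share the "op" suffix states.
        System.out.println(countTrieNodes(new String[]{"stop", "top"}));
    }
}
```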

I also read an article that discusses applications of FSMs:
https://www.cnblogs.com/dreamroute/p/8484457.html

There is also a restricted variant of the finite state machine, the DFA (Deterministic Finite Automaton), which can be used to filter sensitive words.
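Before the filtering code, here is a minimal standalone DFA example of my own (unrelated to the sensitive-word code below): two states that track whether the number of '1' characters read so far is even, with a deterministic transition for every input character.

```java
// Minimal DFA illustration: accept binary strings with an even number of 1s.
// State 0 = even count (accepting), state 1 = odd count.
public class EvenOnesDfa {

    static boolean acceptsEvenOnes(String bits) {
        int state = 0;
        for (char c : bits.toCharArray()) {
            if (c == '1') {
                state = 1 - state; // deterministic transition on '1'
            }
            // on '0' the state is unchanged
        }
        return state == 0; // accept iff we end in the even state
    }

    public static void main(String[] args) {
        System.out.println(acceptsEvenOnes("1011")); // three 1s: rejected
        System.out.println(acceptsEvenOnes("1001")); // two 1s: accepted
    }
}
```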

The code is as follows:

import java.util.HashMap;
import java.util.HashSet;
import java.util.Iterator;
import java.util.Map;
import java.util.Set;

public class All {

    // DFA root: a nested map where each key is a character and each value is
    // the child node; each node also carries an "isEnd" flag marking word endings
    private Map sensitiveWordMap = null;

    private Map addSensitiveWordToHashMap(Set<String> wordSet) {

        // Size the map up front to reduce rehashing
        Map wordMap = new HashMap(wordSet.size());

        for (String word : wordSet) {
            Map nowMap = wordMap;
            for (int i = 0; i < word.length(); i++) {

                // The character at position i
                char keyChar = word.charAt(i);

                // Look up this character at the current level
                Object tempMap = nowMap.get(keyChar);

                // If the child node exists, descend into it
                if (tempMap != null) {
                    nowMap = (Map) tempMap;
                }

                // If it does not exist, create a child node with isEnd set to
                // "0", since it is not yet the last character
                else {

                    // Set flag bit
                    Map<String, String> newMap = new HashMap<String, String>();
                    newMap.put("isEnd", "0");

                    // Add to collection
                    nowMap.put(keyChar, newMap);
                    nowMap = newMap;
                }

                // Mark the last character of the word as an end state
                if (i == word.length() - 1) {
                    nowMap.put("isEnd", "1");
                }
            }
        }

        return wordMap;
    }

    public Set<String> getSensitiveWord(String txt, int matchType) {
        Set<String> sensitiveWordList = new HashSet<String>();

        for (int i = 0; i < txt.length(); i++) {

            // Check whether a sensitive word starts at position i
            int length = CheckSensitiveWord(txt, i, matchType);

            // Found one; add it to the result set
            if (length > 0) {
                sensitiveWordList.add(txt.substring(i, i + length));

                // Subtract 1 because the for loop will increment i
                i = i + length - 1;
            }
        }

        return sensitiveWordList;
    }

    /**
     * Replace sensitive words in the text
     * 
     * @param txt the text to filter
     * @param matchType 1 = minimum match (stop at the shortest word), otherwise maximum match
     * @param replaceChar the character used to mask each matched character
     * @return the text with sensitive words masked
     */
    public String replaceSensitiveWord(String txt, int matchType, String replaceChar) {

        String resultTxt = txt;

        // Get all sensitive words
        Set<String> set = getSensitiveWord(txt, matchType);
        Iterator<String> iterator = set.iterator();
        String word = null;
        String replaceString = null;
        while (iterator.hasNext()) {
            word = iterator.next();
            replaceString = getReplaceChars(replaceChar, word.length());
            // Use replace, not replaceAll: replaceAll would treat the word as a regex
            resultTxt = resultTxt.replace(word, replaceString);
        }

        return resultTxt;
    }

    /**
     * Build the replacement string: replaceChar repeated once per matched character
     * 
     * @param replaceChar the masking character
     * @param length the length of the matched word
     * @return replaceChar repeated length times
     */
    private String getReplaceChars(String replaceChar, int length) {
        // StringBuilder avoids allocating a new String on every iteration
        StringBuilder resultReplace = new StringBuilder(replaceChar);
        for (int i = 1; i < length; i++) {
            resultReplace.append(replaceChar);
        }

        return resultReplace.toString();
    }

    /**
     * Check whether a sensitive word starts at the given position in the text.<br>
     * If one does, return its length; otherwise return 0
     * 
     * @param txt the text to check
     * @param beginIndex the position to start matching from
     * @param matchType 1 = minimum match, otherwise maximum match
     * @return the length of the matched word, or 0 if none
     */
    public int CheckSensitiveWord(String txt, int beginIndex, int matchType) {

        // Set to true once the walk reaches a node whose isEnd flag is "1",
        // i.e. a complete sensitive word has been matched
        boolean flag = false;

        // Number of characters matched so far
        int matchFlag = 0;
        Map nowMap = this.sensitiveWordMap;
        for (int i = beginIndex; i < txt.length(); i++) {
            char word = txt.charAt(i);

            // Follow the transition for this character
            nowMap = (Map) nowMap.get(word);

            // If it exists, judge whether it is the last one
            if (nowMap != null) {

                // Found a transition; extend the match by one character
                matchFlag++;

                // This node ends a complete sensitive word
                if ("1".equals(nowMap.get("isEnd"))) {

                    // Record that a full word was matched
                    flag = true;

                    // Minimum-match rule: return at once; the maximum-match
                    // rule keeps scanning for a longer word
                    if (1 == matchType) {
                        break;
                    }
                }
            }

            // No transition for this character; stop matching
            else {
                break;
            }
        }
        // A match counts only if it is at least 2 characters long and ended on
        // a complete word; otherwise discard it
        if (matchFlag < 2 || !flag) {
            matchFlag = 0;
        }
        return matchFlag;
    }

    private Set<String> readSensitiveWordFile() {

        // Hard-coded word list standing in for reading from a file
        Set<String> wordSet = new HashSet<String>();
        wordSet.add("×××");
        wordSet.add("×××");
        wordSet.add("Bastard");
        wordSet.add("Son of a bitch");
        return wordSet;
    }

    public static void main(String[] args) {

        All all = new All();
        all.sensitiveWordMap = all.addSensitiveWordToHashMap(all.readSensitiveWordFile());
        String txt = "×××";
        String hou = all.replaceSensitiveWord(txt, 1, "*");
        System.out.println("The text before the replacement is:" + txt);
        System.out.println("The replaced text is:" + hou);
    }

}
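To see what addSensitiveWordToHashMap actually builds, here is a standalone sketch of my own (using the made-up example word "bad") of the nested-map DFA and the same character-by-character walk that CheckSensitiveWord performs:

```java
import java.util.HashMap;
import java.util.Map;

// Standalone sketch of the nested-map DFA from the article, built for a
// single hypothetical word ("bad") so the structure is easy to inspect.
public class DfaShapeDemo {

    // Build the nested map for one word, mirroring addSensitiveWordToHashMap.
    static Map<Object, Object> build(String word) {
        Map<Object, Object> root = new HashMap<>();
        Map<Object, Object> now = root;
        for (int i = 0; i < word.length(); i++) {
            Map<Object, Object> next = new HashMap<>();
            // "1" only on the last character; "0" on every earlier node
            next.put("isEnd", i == word.length() - 1 ? "1" : "0");
            now.put(word.charAt(i), next);
            now = next;
        }
        return root;
    }

    // Walk the map character by character, as CheckSensitiveWord does.
    @SuppressWarnings("unchecked")
    static boolean matches(Map<Object, Object> root, String txt) {
        Map<Object, Object> now = root;
        for (char c : txt.toCharArray()) {
            Map<Object, Object> next = (Map<Object, Object>) now.get(c);
            if (next == null) {
                return false;  // no transition for this character
            }
            now = next;
            if ("1".equals(now.get("isEnd"))) {
                return true;   // reached an accepting state
            }
        }
        return false;
    }

    public static void main(String[] args) {
        Map<Object, Object> dfa = build("bad");
        System.out.println(matches(dfa, "bad")); // complete word: true
        System.out.println(matches(dfa, "bat")); // fails at 't': false
    }
}
```

Each node is just a map whose character keys are the outgoing transitions, which is why the lookup in CheckSensitiveWord is a plain `nowMap.get(word)` per character.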

This structure is very similar to the trie I wrote about before; the DFA built here is really a tree, not a graph like an FST.

Tags: Big Data

Posted on Sun, 01 Dec 2019 00:16:37 -0800 by ronnie88