Tomoharu Mitsuhashi


2017

Patent NMT integrated with Large Vocabulary Phrase Translation by SMT at WAT 2017
Zi Long | Ryuichiro Kimura | Takehito Utsuro | Tomoharu Mitsuhashi | Mikio Yamamoto
Proceedings of the 4th Workshop on Asian Translation (WAT2017)

Neural machine translation (NMT) cannot handle a large vocabulary because training complexity and decoding complexity increase in proportion to the number of target words. This problem becomes even more serious when translating patent documents, which contain many technical terms that occur infrequently. Long et al. (2017) proposed selecting phrases that contain out-of-vocabulary words using the statistical measure of branching entropy. The selected phrases are replaced with tokens during training and post-translated using the phrase translation table of SMT. In this paper, we apply the method of Long et al. (2017) to the WAT 2017 Japanese-Chinese and Japanese-English patent datasets. Evaluation on Japanese-to-Chinese, Chinese-to-Japanese, Japanese-to-English, and English-to-Japanese patent sentence translation confirms the effectiveness of phrases selected with branching entropy: the NMT model of Long et al. (2017) achieves a substantial improvement over a baseline NMT model without the proposed technique.
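Branching entropy measures how unpredictable the next token is after a given context: within a phrase the continuation is nearly deterministic (low entropy), while a sharp rise signals a likely phrase boundary. The following is a minimal toy sketch of that idea, not the authors' exact selection procedure; the corpus, the threshold value, and the function names are illustrative assumptions.

```python
from collections import Counter
import math

def branching_entropy(tokens, context):
    """Entropy of the distribution over tokens that follow `context`
    in the corpus; a high value suggests a phrase boundary."""
    n = len(context)
    followers = Counter(
        tokens[i + n]
        for i in range(len(tokens) - n)
        if tuple(tokens[i:i + n]) == context
    )
    total = sum(followers.values())
    if total == 0:
        return 0.0
    return -sum((c / total) * math.log2(c / total) for c in followers.values())

def grow_phrase(tokens, start, end, threshold=1.0):
    """Extend a candidate phrase [start, end) to the right while the
    branching entropy at its right boundary stays below `threshold`
    (i.e. while the continuation is still predictable)."""
    while end < len(tokens) and branching_entropy(tokens, tuple(tokens[start:end])) < threshold:
        end += 1
    return tokens[start:end]
```

For example, in a corpus where "the neural net" always occurs as a unit but is followed by varying words, `grow_phrase` stops extending exactly where the entropy jumps.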

Comparison of SMT and NMT trained with large Patent Corpora: Japio at WAT2017
Satoshi Kinoshita | Tadaaki Oshio | Tomoharu Mitsuhashi
Proceedings of the 4th Workshop on Asian Translation (WAT2017)

Japio participates in the patent subtasks (JPC-EJ/JE/CJ/KJ) with phrase-based statistical machine translation (SMT) and neural machine translation (NMT) systems trained on its own patent corpora in addition to the subtask corpora provided by the organizers of WAT2017. In the EJ and CJ subtasks, SMT and NMT systems trained on corpora of about 50 million and 10 million sentence pairs, respectively, achieved comparable scores in the automatic evaluations, but the NMT systems were superior to the SMT systems in both the official and in-house human evaluations.

Neural Machine Translation Model with a Large Vocabulary Selected by Branching Entropy
Zi Long | Ryuichiro Kimura | Takehito Utsuro | Tomoharu Mitsuhashi | Mikio Yamamoto
Proceedings of Machine Translation Summit XVI: Research Track

2016

Translation of Patent Sentences with a Large Vocabulary of Technical Terms Using Neural Machine Translation
Zi Long | Takehito Utsuro | Tomoharu Mitsuhashi | Mikio Yamamoto
Proceedings of the 3rd Workshop on Asian Translation (WAT2016)

Neural machine translation (NMT), a new approach to machine translation, has achieved promising results comparable to those of traditional approaches such as statistical machine translation (SMT). Despite its recent success, NMT cannot handle a large vocabulary because training complexity and decoding complexity increase in proportion to the number of target words. This problem becomes even more serious when translating patent documents, which contain many technical terms that occur infrequently. In NMT, words that are out of vocabulary are represented by a single unknown token. In this paper, we propose a method that enables NMT to translate patent sentences comprising a large vocabulary of technical terms. We train an NMT system on bilingual data in which technical terms are replaced with technical term tokens; this allows it to translate most of each source sentence except the technical terms. We then use this system as a decoder to translate source sentences with technical term tokens and replace the tokens with technical term translations obtained from SMT. We also use it to rerank the 1,000-best SMT translations on the basis of the average of the SMT score and the NMT score of the translated sentence with technical term tokens. Our experiments on Japanese-Chinese patent sentences show that the proposed NMT system achieves a substantial improvement of up to 3.1 BLEU points and 2.3 RIBES points over traditional SMT systems, and an improvement of approximately 0.6 BLEU points and 0.8 RIBES points over an equivalent NMT system without the proposed technique.
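The reranking step described above combines the SMT model score with an NMT score computed on each hypothesis after its technical terms are replaced with tokens. The sketch below illustrates only that averaging-and-argmax step under stated assumptions; the field names and the `nmt_score` interface are hypothetical, not the authors' code.

```python
def rerank_nbest(nbest, nmt_score):
    """Pick the hypothesis maximizing the average of its SMT score and
    the NMT score of its token-replaced form.

    `nbest` is a list of dicts with keys "smt_score" (a float) and
    "tokenized" (the hypothesis with technical terms replaced by tokens);
    `nmt_score` maps a token-replaced hypothesis to a model score
    (e.g. a log-probability).  Both names are illustrative assumptions.
    """
    return max(nbest, key=lambda h: 0.5 * (h["smt_score"] + nmt_score(h["tokenized"])))
```

In this scheme a hypothesis ranked lower by SMT alone can win overall if the NMT model scores its token-replaced form much higher, which is the intended effect of the combined rescoring.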

Translation Using JAPIO Patent Corpora: JAPIO at WAT2016
Satoshi Kinoshita | Tadaaki Oshio | Tomoharu Mitsuhashi | Terumasa Ehara
Proceedings of the 3rd Workshop on Asian Translation (WAT2016)

We participate in the scientific paper subtask (ASPEC-EJ/CJ) and the patent subtask (JPC-EJ/CJ/KJ) with phrase-based SMT systems trained on our own patent corpora. Using larger corpora than those prepared by the workshop organizer, we achieved higher BLEU scores than most participants in the EJ and CJ translations of the patent subtask, but in the crowdsourcing evaluation our EJ translation, which was best in all automatic evaluations, received a very poor score. In the scientific paper subtask, our translations received lower scores than most translations produced by engines trained on the in-domain corpora, but higher scores than those of general-purpose RBMTs and online services. Considering the crowdsourcing evaluation results, this suggests that a CJ SMT system trained on a large patent corpus can translate non-patent technical documents at a practical level.

2015

Evaluating Features for Identifying Japanese-Chinese Bilingual Synonymous Technical Terms from Patent Families
Zi Long | Takehito Utsuro | Tomoharu Mitsuhashi | Mikio Yamamoto
Proceedings of the Eighth Workshop on Building and Using Comparable Corpora

Collecting bilingual technical terms from patent families of character-segmented Chinese sentences and morpheme-segmented Japanese sentences
Zi Long | Takehito Utsuro | Tomoharu Mitsuhashi | Mikio Yamamoto
Proceedings of the 6th Workshop on Patent and Scientific Literature Translation