HAN TE SUN

Emeritus Professor

Degree

  • Doctor of Engineering, The University of Tokyo

Educational Background

  • Mar. 1971
    The University of Tokyo, Graduate School, Division of Engineering, Applied Physics (Mathematical Engineering)
  • Mar. 1964
    The University of Tokyo, Faculty of Engineering, Department of Mathematical Engineering and Instrumentation Physics

Member History

  • 2002
    Advisor, Society of Information Theory and its Applications, Society
  • 2000 - 2001
    President, Society of Information Theory and its Applications, Society
  • 1998 - 2000
    Vice President, Society of Information Theory and its Applications, Society
  • 1994 - 1997
    Shannon Theory Editor for IEEE Trans. on Information Theory, IEEE Information Theory Society, Society
  • 1996 - 1996
    Member and Chair, Technical Committee on Information Theory, The Institute of Electronics, Information and Communication Engineers (IEICE), Society
  • 1994 - 1996
    Director, Society
  • 1981 - 1991
    Editorial Director and Planning Director, Society of Information Theory and its Applications, Society

Award

  • May 2004
    IEICE Achievement Award (電子情報通信学会 業績賞)
  • 2002
    IEICE Fellow
    Japan
  • 2000
    Telecom Technology Award, Telecommunications Advancement Foundation
  • 1998
    Okawa Publications Prize for the book 情報理論における情報スペクトル的方法 (Information-Spectrum Methods in Information Theory)
  • 1996
    Editor Award for Shannon Theory, IEEE-IT
    USA
  • 1990
    IEEE Fellow

Paper

  • Folklore in source coding: Information-spectrum approach
    TS Han
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 51, 2, 747-753, Feb. 2005, Peer-reviewed, Information theory has several traditional folklore problems about data compression or channel coding with reference to random number generation problems. Here, we focus on and reasonably formulate one of them from the viewpoint of information spectra. Specifically, we verify the validity of the folklore that the output from any source encoder working at the optimal coding rate with asymptotically vanishing probability of error looks almost completely random.
    Scientific journal, English
  • Interval algorithm for homophonic coding
    M Hoshi; TS Han
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 47, 3, 1021-1031, Mar. 2001, Peer-reviewed, It is shown that the idea of the successive refinement of interval partitions, which plays the key role in the interval algorithm for random number generation by Han and Hoshi, is also applicable to homophonic coding. An interval algorithm for homophonic coding is introduced which produces an independent and identically distributed (i.i.d.) sequence with probability p. Lower and upper bounds for the expected codeword length are given. Based on this, an interval algorithm for fixed-to-variable homophonic coding is established. The expected codeword length per source letter converges to H(X)/H(p) in probability as the block length tends to infinity, where H(X) is the entropy rate of the source X. The algorithm is asymptotically optimal. An algorithm for fixed-to-fixed homophonic coding is also established. The decoding error probability tends to zero as the block length tends to infinity. Homophonic coding with cost is generally considered. The expected cost of the codeword per source letter converges to c̄(X)/H(p) in probability as the block length tends to infinity, where c̄(X) denotes the average cost of a source letter. The main contribution of this paper can be regarded as a novel application of Elias' coding technique to homophonic coding. Intrinsic relations among these algorithms, the interval algorithm for random number generation, and the arithmetic code are also discussed.
    Scientific journal, English
  • The optimal overflow and underflow probabilities of variable-length coding for the general source
    O.Uchida; T.S.Han
    IEICE Transactions, E84-A, 10, 2457-2465, 2001, Peer-reviewed
    English
  • Hypothesis testing with the general source
    TS Han
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 46, 7, 2415-2427, Nov. 2000, Peer-reviewed, The asymptotically optimal hypothesis testing problem, with general sources as the null and alternative hypotheses, is studied under exponential-type error constraints on the first kind of error probability. Our fundamental philosophy is to convert all of the hypothesis testing problems to the pertinent computation problems in the large deviation-probability theory. This methodologically new approach enables us to establish compact general formulas of the optimal exponents of the second kind of error and correct testing probabilities for the general sources including all nonstationary and/or nonergodic sources with arbitrary abstract alphabet (countable or uncountable). These general formulas are presented from the information-spectrum point of view.
    Scientific journal, English
  • The reliability functions of the general source with fixed-length coding
    T. S. Han
    IEEE Transactions on Information Theory, IT-46, 6, 2108-2116, Sep. 2000, Peer-reviewed
    English
  • Theorems on the variable-length intrinsic randomness
    TS Han
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 46, 6, 2108-2116, Sep. 2000, Peer-reviewed, In this paper we address variable-length intrinsic randomness problems (in the sense of Vembu and Verdu [1]) for a countably infinite source alphabet X under the (unnormalized) divergence distance, the normalized conditional divergence distance, and the variational distance. It turns out that under all three kinds of approximation measures the variable-length intrinsic randomness still takes the same value, called the inf-entropy rate of the source.
    Scientific journal, English
  • Weak variable-length source coding
    TS Han
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 46, 4, 1217-1226, Jul. 2000, Peer-reviewed, Given a general source X = {X^n}_{n=1}^∞, source coding is characterized by a pair (φ_n, ψ_n) of encoder φ_n and decoder ψ_n together with the probability of error ε_n = Pr{ψ_n(φ_n(X^n)) ≠ X^n}. If the length of the encoder output φ_n(X^n) is fixed, then it is called fixed-length source coding, while if the length of the encoder output φ_n(X^n) is variable, then it is called variable-length source coding. Usually, in the context of fixed-length source coding the probability of error ε_n is required to asymptotically vanish (i.e., lim_{n→∞} ε_n = 0), whereas in the context of variable-length source coding the probability of error ε_n is required to be exactly zero (i.e., ε_n = 0 for all n = 1, 2, ...). In contrast to these, we consider in the present paper the problem of variable-length source coding with asymptotically vanishing probability of error (i.e., lim_{n→∞} ε_n = 0), and establish several fundamental theorems on this new subject.
    Scientific journal, English
  • "Source code with cost as a nonuniform random number generation"
    T. S. Han; O. Uchida
    IEEE Transactions on Information Theory, 46, 2, 712-717, Mar. 2000, Peer-reviewed
    Scientific journal, English
  • "かく乱母数を含む場合のMDL基準の構築と空間図形モデル推定問題への応用"
    長尾淳平; 韓 太舜
    IEICE Transactions (電子情報通信学会論文誌), J83-A, 1, 83-95, Jan. 2000, Peer-reviewed
    Scientific journal, Japanese
  • Interval Algorithms for Homophonic Coding (invited talk)
    M. Hoshi; T. S. Han
    Proceedings of the 1999 IEEE Information Theory and Communications Workshop, Kruger National Park, South Africa, June 20-25, 88-90, Jun. 1999
    English
  • "An Information-spectrum approach to information theory" (special invited lecture)
    T. S. Han
    IEEE International Workshop on Information Theory, Kruger Park, South Africa, June 25-30, 1999, Jun. 1999
    English
  • Disjointness of random sequence sets with respect to distinct probability measures
    TS Han; M Hamada
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 45, 2, 756-760, Mar. 1999, It is shown that the set of deterministic random sequences (of symbols from a finite alphabet) with respect to a computable probability measure mu, in Martin-Lof's sense, and the set of deterministic random sequences with respect to another computable probability measure nu are disjoint if mu and nu are different and the measures are either i.i.d. or homogeneous finite-order irreducible Markov measures.
    Scientific journal, English
  • "Weak variable-length source coding" (invited talk)
    T. S. Han
    IEEE International Workshop on Networking and Information Theory, July 2-7, 1999, Metsovo, Greece, 1999
    English
  • An information-spectrum approach to capacity theorems for the general multiple-access channel
    TS Han
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 44, 7, 2773-2795, Nov. 1998, Peer-reviewed, The paper deals with the capacity problems for the general multiple-access channel, where the channel transition probabilities are arbitrary for every blocklength n. The approach used here, which is called the information-spectrum approach, is quite different from the standard typical-sequence and/or AEP techniques. The general formula for the capacity region for the general multiple-access channel is thus established. In order to examine the potentiality, we apply it to the mixed channel to obtain the formula for its capacity region. The case where input cost constraints are imposed is also considered.
    Scientific journal, English
  • Statistical inference under multiterminal data compression (invited paper)
    T. S. Han; S. Amari
    IEEE Transactions on Information Theory, 44, 6, 2300-2324, Oct. 1998, Peer-reviewed
    Scientific journal, English
  • "Information-spectrum methods in information theory," (invited talk)
    T.S. Han
    IEEE Workshop on Information Theory, San Diego, USA, February 9-12, 1998, 1998
    Scientific journal, English
  • "From the method of types toward information-spectrum methods," (invited talk)
    T.S. Han
    Academy Colloquium, Information Theory: The first 50 years and beyond, Royal Netherlands Academy of Arts and Sciences, Amsterdam, The Netherlands, June 17-19, 1998, 1998
    Scientific journal, English
  • An information-spectrum approach to source coding theorems with a fidelity criterion
    TS Han
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 43, 4, 1145-1164, Jul. 1997, Peer-reviewed, The rate-distortion problem for the general class of nonstationary and/or nonergodic sources with an arbitrary distortion measure (not necessarily additive) is studied. We are especially concerned with the case of variable-rate coding under the maximum-distortion criterion. It turns out that, in the framework where we cannot readily invoke the standard asymptotic equipartition property, an information-spectrum approach devised by Han and Verdu plays the key role in establishing such a general formula. Comparisons with the rate-distortion formulas with fixed-rate coding of Steinberg and Verdu are also discussed to obtain an insight into the general features of this kind of nonstationary and nonergodic problems.
    Scientific journal, English
  • The role of the asymptotic equipartition property in noiseless source coding
    S Verdu; TS Han
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 43, 3, 847-857, May 1997, Peer-reviewed, The (noiseless) fixed-length source coding theorem states that, except for outcomes in a set of vanishing probability, a source can be encoded at its entropy but not more efficiently. It is well known that the Asymptotic Equipartition Property (AEP) is a sufficient condition for a source to be encodable at its entropy. This paper shows that the AEP is necessary for the source coding theorem to hold for nonzero-entropy finite-alphabet sources. Furthermore, we show that a nonzero-entropy finite-alphabet source satisfies the direct coding theorem if and only if it satisfies the strong converse. In addition, we introduce the more general setting of nonserial information sources which need not put out strings of symbols. In this context, which encompasses the conventional serial setting, the AEP is equivalent to the validity of the strong coding theorem. Fundamental limits for data compression of nonserial information sources are shown based on the flat-top property, a new sufficient condition for the AEP.
    Scientific journal, English
  • Universal coding of integers and unbounded search trees
    R. Ahlswede; T.S. Han; K. Kobayashi
    IEEE Transactions on Information Theory, IT-43, 2, 669-682, 1997, Peer-reviewed
    English
  • The strong converse for source coding with a fidelity criterion (in Russian)
    T.S.Han; H. Ooishi
    Problems of Information Transmission, 32, 82-90, 1996, Peer-reviewed
    English
  • (invited talk) "Interval algorithm for random number generation," IEEE Workshop on Information Theorey, Haifa, Israel, June 9 - 13, 1996
    T.S. Han; M. Hoshi
    IEEE Workshop on Information Theory, Haifa, Israel, June 9 - 13, 1996, 1996
    Scientific journal, English
  • The asymptotics of posterior entropy and error probability for Bayesian estimation
    F Kanaya; TS Han
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 41, 6, 1988-1992, Nov. 1995, Peer-reviewed, We consider the Bayesian parameter estimation problem where the value of a finitary parameter X should be decided on the basis of an i.i.d. sample Y^n of size n. In this context, the amount of missing information on X after observing Y^n may be evaluated by the posterior entropy, which is often called the equivocation or the conditional entropy, of X given Y^n, while it is well known that the minimum possible probability of error in estimating X is achieved by the maximum a posteriori probability (MAP) estimator. In this work, the focus is on the asymptotic relation between the posterior entropy and the MAP error probability as the sample size n becomes sufficiently large. It is shown that if the sample size n is large enough, the posterior entropy as well as the MAP error probability decay with n to zero at the identical exponential rate, and that the maximum achievable exponent for this decay is determined by the minimum Chernoff information over all the possible pairs of distinct parameter values. The results presented in this correspondence may be considered as a simpler derivation and also a generalization of the prior work of Renyi, Hellman, and Raviv.
    Scientific journal, English
  • "Nonserial source coding," (invited talk)
    S. Verdu; T.S. Han
    IEEE Workshop on Information Theory, Rydzyna, Poland, June 25 - 29, 1995, 1995
    Scientific journal, English
  • UNIVERSAL CODING FOR THE SLEPIAN-WOLF DATA-COMPRESSION SYSTEM AND THE STRONG CONVERSE THEOREM
    Y OOHAMA; TS HAN
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 40, 6, 1908-1919, Nov. 1994, Peer-reviewed, Universal coding for the Slepian-Wolf data compression system is considered. We shall demonstrate based on a simple observation that the error exponent given by Csiszar and Korner for the universal coding system can strictly be sharpened in general for a region of relatively higher rates. This kind of observation can be carried over also to the case of lower rates outside the Slepian-Wolf region, which establishes the strong converse along with the optimal exponent.
    Scientific journal, English
  • A GENERAL FORMULA FOR CHANNEL CAPACITY
    S VERDU; TS HAN
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 40, 4, 1147-1157, Jul. 1994, Peer-reviewed, A formula for the capacity of arbitrary single-user channels without feedback (not necessarily information stable, stationary, etc.) is proved. Capacity is shown to equal the supremum, over all input processes, of the input-output information rate defined as the liminf in probability of the normalized information density. The key to this result is a new converse approach based on a simple new lower bound on the error probability of m-ary hypothesis tests among equiprobable hypotheses. A necessary and sufficient condition for the validity of the strong converse is given, as well as general expressions for ε-capacity.
    Scientific journal, English
  • GENERALIZING THE FANO INEQUALITY
    TS HAN; S VERDU
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 40, 4, 1247-1251, Jul. 1994, Peer-reviewed, The Fano inequality gives a lower bound on the mutual information between two random variables that take values on an M-element set, provided at least one of the random variables is equiprobable. We show several simple lower bounds on mutual information which do not assume such a restriction. In particular, this can be accomplished by replacing log M with the infinite-order Renyi entropy in the Fano inequality. Applications to hypothesis testing are exhibited along with bounds on mutual information in terms of the a priori and a posteriori error probabilities.
    Scientific journal, English
  • APPROXIMATION-THEORY OF OUTPUT STATISTICS
    TS HAN; S VERDU
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 39, 3, 752-772, May 1993, Peer-reviewed, Given a channel and an input process, the minimum randomness of those input processes whose output statistics approximate the original output statistics with arbitrary accuracy is studied. The notion of resolvability of a channel, defined as the number of random bits required per channel use in order to generate an input that achieves arbitrarily accurate approximation of the output statistics for any given input process, is introduced. A general formula for resolvability that holds regardless of the channel memory structure, is obtained. It is shown that, for most channels, resolvability is equal to Shannon capacity. By-products of the analysis are a general formula for the minimum achievable (fixed-length) source coding rate of any finite-alphabet source, and a strong converse of the identification coding theorem, which holds for any channel that satisfies the strong converse of the channel coding theorem.
    Scientific journal, English
  • Spectrum invariancy under output approximation for full-rank discrete memoryless channels (in Russian)
    T.S. Han
    Problems of Information Transmission, 29, 2, 9-27, 1993, Peer-reviewed
    Scientific journal, English
  • "Zero-rate multiterminal data compression and statistical inference"(invited talk)
    T. S. Han
    Interdisciplinary Forum on Information and Geometry, Hakone, Japan, March 14-20, 1993, 1993
    Scientific journal, English
  • (invited talk) "Statistical inference under multiterminal zero-rate data compression"
    T.S. Han; S. Amari
    Workshop on Statistical Inference, Differential Geometry and Computer Algebra, Sandbjerg, Denmark, May 9 - 14, 1993, 1993
    Scientific journal, English
  • "A new approach to the converse of channel coding theorem" (invited talk)
    T. S. Han
    IEEE Workshop on Information Theory, Susono, Japan, June 4 - 8, 1993, 1993
    Scientific journal, English
  • (invited talk) "A general formula for channel capacity"
    T.S. Han; S. Verdu
    The Sixth Swedish-Russian International Workshop on Information Theory, Molle, Sweden, August 22 - 27, 1993, 1993
    Scientific journal, English
  • MULTITERMINAL FILTERING FOR DECENTRALIZED DETECTION SYSTEMS
    TS HAN; K KOBAYASHI
    IEICE TRANSACTIONS ON COMMUNICATIONS, IEICE-INST ELECTRONICS INFORMATION COMMUNICATIONS ENG, E75B, 6, 437-444, Jun. 1992, The optimal coding strategy for signal detection in the correlated Gaussian noise is established for the distributed sensors system with essentially zero transmission rate constraint. Specifically, we are able to obtain the same performance as in the situation of no restriction on rate from each sensor terminal to the fusion center. This simple result contrasts with the previous ad hoc studies containing many unnatural assumptions such as the independence of noises contaminating the received signal at each sensor. For the design of the optimal coder, we can use the classical Levinson-Wiggins-Robinson fast algorithm for block Toeplitz matrices to evaluate the necessary weight vector for the maximum-likelihood detection.
    Scientific journal, English
  • NEW RESULTS IN THE THEORY OF IDENTIFICATION VIA CHANNELS
    TS HAN; S VERDU
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 38, 1, 14-25, Jan. 1992, Peer-reviewed, The identification capacity is the maximal iterated logarithm of the number of messages divided by the blocklength that can be reliably transmitted when the receiver is only interested in deciding whether a specific message was transmitted or not. The identification coding theorem of Ahlswede and Dueck for single-user discrete memoryless channels states that the identification capacity is equal to the Shannon capacity. A new method to prove the converse to the identification coding theorem is shown to achieve the strong version of the result. Identification plus transmission (IT) coding, a variant of the original problem of identification via channels, is proposed in the context of a common problem in point-to-multipoint communication, where a central station wishes to transmit information reliably to one of N terminals, whose identity is not predetermined. We show that as long as log log N is smaller than the number of bits to be transmitted, IT codes allow information transmission at channel capacity.
    Scientific journal, English
  • (invited talk) "Channel resolvability "
    T.S. Han
    IEEE Workshop on Information Theory, Salvador, Brazil, June 21 - 27, 1992, 1992
    Scientific journal, English
  • A CONVERSE ON THE ERROR EXPONENT FOR BINARY CHANNELS - AN APPROACH BASED ON THE INTERSECTION OF SPHERES
    TS HAN; K KOBAYASHI
    IEICE TRANSACTIONS ON COMMUNICATIONS ELECTRONICS INFORMATION AND SYSTEMS, IEICE-INST ELECTRON INFO COMMUN ENG, 74, 9, 2465-2472, Sep. 1991, Peer-reviewed, One of the basic problems in Information Theory, that is, the determination of the reliability function of binary symmetric channel, is studied by establishing the exponent of cardinality of intersection of two Hamming spheres.
    Scientific journal, English
  • Feedback Codes With Uniformly Bounded Codeword Lengths and Zero-Error Capacities
    Te Sun Han; Hajime Sato
    IEEE Transactions on Information Theory, 37, 3, 655-660, 1991, Peer-reviewed, A certain class of variable-length codes with feedback whose codeword lengths are uniformly upper bounded is considered. For this class of variable-length codes, it is shown that the zero-error capacity region for the single-user and multiuser channels with feedback can be extended up to the ordinary average-error capacity under some conditions, if we use variable-length codes (semiblock codes) in place of fixed-length codes. This condition is different from that of Burnashev for variable-length codes with feedback but without any uniform bound on the codeword lengths. It is also shown that the capacity region for variable-length feedback codes coincides with that for fixed-length feedback codes. © IEEE, 1991.
    Scientific journal, English
  • (invited talk) "Statistical inference and multiterminal information theory"
    S. Amari; T.S. Han
    The 44th Session of the International Statistical Institute, Cairo, Egypt, September 9-17, 1991, 1991
    Scientific journal, English
  • "Optimal statistical inference with zero-rate multiterminal data compression" (invited talk)
    T.S. Han
    Laboratory for Information and Decision Systems, Massachusetts Institute of Technology, USA, March 7, 1991, 1991
    Scientific journal, Japanese
  • EXPONENTIAL-TYPE ERROR PROBABILITIES FOR MULTITERMINAL HYPOTHESIS-TESTING
    TS HAN; K KOBAYASHI
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 35, 1, 2-14, Jan. 1989, Peer-reviewed
    Scientific journal, English
  • The strong converse for hypothesis testings
    T.S.Han; K.Kobayashi
    IEEE Transactions on Information Theory, IT-35, 1, 2-14, 1989, Peer-reviewed
    Scientific journal, English
  • Statistical inference under multi-terminal rate constraints: a differential geometrical approach
    S.Amari; T.S. Han
    IEEE Transactions on Information Theory, IT-35, 2, 217-227, 1989, Peer-reviewed
    Scientific journal, English
  • BROADCAST CHANNELS WITH ARBITRARILY CORRELATED SOURCES
    TS HAN; MHM COSTA
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 33, 5, 641-650, Sep. 1987, Peer-reviewed
    Scientific journal, English
  • Hypothesis testing with multiterminal data compression
    T.S.Han
    IEEE Transactions on Information Theory, IT-33, 6, 1987, Peer-reviewed
    English
  • A dichotomy of functions F(X,Y) of correlated sources (X,Y) from the viewpoint of the achievable rate region
    T.S.Han; K.Kobayashi
    IEEE Transactions on Information Theory, IT-33, 1, 1987, Peer-reviewed
    Scientific journal, English
  • (invited talk) "Multiterminal hypothesis testing and exponential errors"
    T.S. Han
    USSR-Swedish Workshop on Information Theory, Sochi, USSR, June, 1987, 1987
    Scientific journal, English
  • (invited talk) "Hypothesis testing with multiterminal data compression"
    T. S. Han
    Workshop on Information Theory, Oberwolfach, Germany, May, 1986, 1986
    Scientific journal, English
  • MAXIMAL RECTANGULAR SUBSETS CONTAINED IN THE SET OF PARTIALLY JOINTLY TYPICAL SEQUENCES FOR DEPENDENT RANDOM-VARIABLES
    TS HAN; K KOBAYASHI
    ZEITSCHRIFT FUR WAHRSCHEINLICHKEITSTHEORIE UND VERWANDTE GEBIETE, SPRINGER VERLAG, 70, 1, 15-32, 1985, Peer-reviewed
    Scientific journal, English
  • A general coding scheme for the two-way channel
    T.S.Han
    IEEE Transactions on Information Theory, IT-30, 1, 35-44, 1984, Peer-reviewed
    Scientific journal, English
  • ON SOURCE-CODING WITH SIDE INFORMATION VIA A MULTIPLE-ACCESS CHANNEL AND RELATED PROBLEMS IN MULTI-USER INFORMATION-THEORY
    R AHLSWEDE; TS HAN
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 29, 3, 396-412, 1983, Peer-reviewed
    Scientific journal, English
  • THE CAPACITY REGION FOR THE DETERMINISTIC BROADCAST CHANNEL WITH A COMMON MESSAGE
    TS HAN
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 27, 1, 122-125, 1981, Peer-reviewed
    Scientific journal, English
  • A NEW ACHIEVABLE RATE REGION FOR THE INTERFERENCE CHANNEL
    TS HAN; K KOBAYASHI
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 27, 1, 49-60, 1981, Peer-reviewed
    Scientific journal, English
  • Slepian-Wolf-Cover theorem for a network of channels
    T.S.Han
    Information and Control, 47, 67-83, 1980, Peer-reviewed
    Scientific journal, English
  • MULTIPLE MUTUAL INFORMATIONS AND MULTIPLE INTERACTIONS IN FREQUENCY DATA
    TS HAN
    INFORMATION AND CONTROL, ACADEMIC PRESS INC JNL-COMP SUBSCRIPTIONS, 46, 1, 26-45, 1980, Peer-reviewed
    Scientific journal, English
  • A UNIFIED ACHIEVABLE RATE REGION FOR A GENERAL-CLASS OF MULTI-TERMINAL SOURCE-CODING SYSTEMS
    TS HAN; K KOBAYASHI
    IEEE TRANSACTIONS ON INFORMATION THEORY, IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC, 26, 3, 277-288, 1980, Peer-reviewed
    Scientific journal, English
  • Source coding with cross observation at the encoders
    T.S.Han
    IEEE Transactions on Information Theory, IT-25, 3, 360-361, 1979, Peer-reviewed
    English
  • The capacity region of general multiple access channel with certain correlated sources
    T.S.Han
    Information and Control, 40, 1, 37-60, 1979, Peer-reviewed
    Scientific journal, English
  • Nonnegative entropy measures of multivariate symmetric correlations
    T.S.Han
    Information and Control, 36, 2, 133-156, 1978, Peer-reviewed
    Scientific journal, English
  • Linear dependence structure of the entropy space
    T.S.Han
    Information and Control, 29, 3, 337-368, 1975, Peer-reviewed
    Scientific journal, English
  • An intrinsic structure of the auditory sensation space with special reference to the equal-sensation contours
    T.S.Han
    Biological Cybernetics, 20, 1, 27-36, 1975, Peer-reviewed
    English

Books and other publications

  • Information-Spectrum Methods in Information Theory
    T.S.Han
    English, Springer-Verlag, Heidelberg, 2002
  • Mathematics of Information and Coding
    Te Sun Han; Kingo Kobayashi
    English, American Mathematical Society, 2001
  • "情報と符号化の数理"
    韓 太舜; 小林欣吾
    Japanese, 培風館, Oct. 1999
  • 情報理論における情報スペクトル的方法 (Information-Spectrum Methods in Information Theory)
    韓 太舜
    Japanese, Baifukan (培風館), Apr. 1998

Lectures, oral presentations, etc.

  • Interval Algorithms for Homophonic Coding (invited talk)
    M. Hoshi; T. S. Han
    Invited oral presentation, English, Proceedings of the 1999 IEEE Information Theory and Communications Workshop, Kruger National Park, South Africa, June 20-25, International conference
    Jun. 1999
  • "An Information-spectrum approach to information theory" (special invited lecture)
    T. S. Han
    Invited oral presentation, English, IEEE International Workshop on Information Theory, Kruger Park, South Africa, June 25-30, 1999, International conference
    Jun. 1999
  • "Weak variable-length source coding" (invited talk)
    T. S. Han
    Invited oral presentation, English, IEEE International Workshop on Networking and Information Theory, July 2-7, 1999, Metsovo, Greece, International conference
    1999
  • "Information-spectrum methods in information theory," (invited talk)
    T.S. Han
    Invited oral presentation, English, IEEE Workshop on Information Theory, San Diego, USA, February 9-12, 1998, International conference
    1998
  • "From the method of types toward information-spectrum methods," (invited talk)
    T.S. Han
    Invited oral presentation, English, Academy Colloquium, Information Theory: The first 50 years and beyond, Royal Netherlands Academy of Arts and Sciences, Amsterdam, The Netherlands, June 17-19, 1998, International conference
    1998
  • (invited talk) "Interval algorithm for random number generation," IEEE Workshop on Information Theorey, Haifa, Israel, June 9 - 13, 1996
    T.S. Han; M. Hoshi
    Invited oral presentation, English, IEEE Workshop on Information Theory, Haifa, Israel, June 9 - 13, 1996, International conference
    1996
  • "Nonserial source coding," (invited talk)
    S. Verdu; T.S. Han
    Invited oral presentation, English, IEEE Workshop on Information Theory, Rydzyna, Poland, June 25 - 29, 1995, International conference
    1995
  • "Zero-rate multiterminal data compression and statistical inference"(invited talk)
    T. S. Han
    Invited oral presentation, English, Interdisciplinary Forum on Information and Geometry, Hakone, Japan, March 14-20, 1993, International conference
    1993
  • (invited talk) "Statistical inference under multiterminal zero-rate data compression"
    T.S. Han; S. Amari
    Invited oral presentation, English, Workshop on Statistical Inference, Differential Geometry and Computer Algebra, Sandbjerg, Denmark, May 9 - 14, 1993, International conference
    1993
  • "A new approach to the converse of channel coding theorem" (invited talk)
    T. S. Han
    Invited oral presentation, English, IEEE Workshop on Information Theory, Susono, Japan, June 4 - 8, 1993, International conference
    1993
  • (invited talk) "A general formula for channel capacity"
    T.S. Han; S. Verdu
    Invited oral presentation, English, The Sixth Swedish-Russian International Workshop on Information Theory, Molle, Sweden, August 22 - 27, 1993, International conference
    1993
  • (invited talk) "Channel resolvability "
    T.S. Han
    Invited oral presentation, English, IEEE Workshop on Information Theory, Salvador, Brazil, June 21 - 27, 1992, International conference
    1992
  • (invited talk) "Statistical inference and multiterminal information theory"
    S. Amari; T.S. Han
    Invited oral presentation, English, The 44th Session of the International Statistical Institute, Cairo, Egypt, September 9-17, 1991, International conference
    1991
  • "Optimal statistical inference with zero-rate multiterminal data compression" (invited talk)
    T.S. Han
    Invited oral presentation, Japanese, Laboratory for Information and Decision Systems, Massachusetts Institute of Technology, USA, March 7, 1991, International conference
    1991
  • (invited talk) "Multiterminal hypothesis testing and exponential errors"
    T.S. Han
    Invited oral presentation, English, USSR-Swedish Workshop on Information Theory, Sochi, USSR, June, 1987, International conference
    1987
  • (invited talk) "Hypothesis testing with multiterminal data compression"
    T. S. Han
    Invited oral presentation, English, Workshop on Information Theory, Oberwolfach, Germany, May, 1986, International conference
    1986

Affiliated academic society

  • Society of Information Theory and its Applications
  • IEEE Information Theory Society
  • The Institute of Electronics, Information and Communication Engineers (IEICE)
  • IEEE (Information Theory Society)