\chapter{外文资料原文}
\label{cha:engorg}

\title{The title of the English paper}

\textbf{Abstract:} As one of the most widely used techniques in operations research, \emph{mathematical programming} is defined as a means of maximizing a quantity known as the \emph{objective function}, subject to a set of constraints represented by equations and inequalities. Some well-known subtopics of mathematical programming are linear programming, nonlinear programming, multiobjective programming, goal programming, dynamic programming, and multilevel programming$^{[1]}$. It is impossible to cover every concept of mathematical programming in a single chapter. This chapter introduces only the basic concepts and techniques of mathematical programming that readers need in order to follow the rest of the book$^{[2,3]}$.

\section{Single-Objective Programming}

The general form of single-objective programming (SOP) is written as follows,
\begin{equation}\tag*{(123)}
% If you do not want an appendix equation to appear in the index of
% equations, use \tag*{xxxx}.
\left\{\begin{array}{l}
\max\,\,f(x)\\[0.1cm]
\mbox{subject to:}\\[0.1cm]
\qquad g_j(x)\le 0,\quad j=1,2,\cdots,p
\end{array}\right.
\end{equation}
which maximizes a real-valued function $f$ of $x=(x_1,x_2,\cdots,x_n)$ subject to a set of constraints.

\newtheorem{mpdef}{Definition}[chapter]
\begin{mpdef}
In SOP, we call $x$ a decision vector, and $x_1,x_2,\cdots,x_n$ decision variables. The function $f$ is called the objective function. The set
\begin{equation}\tag*{(456)} % Likewise here; the remaining cases are not noted individually.
S=\left\{x\in\Re^n\bigm|g_j(x)\le 0,\,j=1,2,\cdots,p\right\}
\end{equation}
is called the feasible set. An element $x$ in $S$ is called a feasible solution.
\end{mpdef}

\newtheorem{mpdefop}[mpdef]{Definition}
\begin{mpdefop}
A feasible solution $x^*$ is called the optimal solution of SOP if and only if
\begin{equation}
f(x^*)\ge f(x)
\end{equation}
for any feasible solution $x$.
\end{mpdefop}

One of the outstanding contributions to mathematical programming is known as the Kuhn-Tucker conditions~(\ref{eq:ktc}). In order to introduce them, let us give some definitions. An inequality constraint $g_j(x)\le 0$ is said to be active at a point $x^*$ if $g_j(x^*)=0$. A point $x^*$ satisfying $g_j(x^*)\le 0$ is said to be regular if the gradient vectors $\nabla g_j(x)$ of all active constraints are linearly independent. Let $x^*$ be a regular point of the constraints of SOP and assume that all the functions $f(x)$ and $g_j(x),j=1,2,\cdots,p$ are differentiable. If $x^*$ is a local optimal solution, then there exist Lagrange multipliers $\lambda_j,j=1,2,\cdots,p$ such that the following Kuhn-Tucker conditions hold,
\begin{equation}
\label{eq:ktc}
\left\{\begin{array}{l}
\nabla f(x^*)-\sum\limits_{j=1}^p\lambda_j\nabla g_j(x^*)=0\\[0.3cm]
\lambda_jg_j(x^*)=0,\quad j=1,2,\cdots,p\\[0.2cm]
\lambda_j\ge 0,\quad j=1,2,\cdots,p.
\end{array}\right.
\end{equation}
If all the functions $f(x)$ and $g_j(x),j=1,2,\cdots,p$ are convex and differentiable, and the point $x^*$ satisfies the Kuhn-Tucker conditions (\ref{eq:ktc}), then it has been proved that $x^*$ is a global optimal solution of SOP.

\subsection{Linear Programming}
\label{sec:lp}

If the functions $f(x),g_j(x),j=1,2,\cdots,p$ are all linear, then SOP is called a {\em linear programming}. The feasible set of linear programming is always convex. A point $x$ is called an extreme point of a convex set $S$ if $x\in S$ and $x$ cannot be expressed as a convex combination of two distinct points in $S$. It has been shown that the optimal solution to linear programming corresponds to an extreme point of its feasible set, provided that the feasible set $S$ is bounded. This fact is the basis of the {\em simplex algorithm}, which was developed by Dantzig as a very efficient method for solving linear programming.
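As a concrete illustration of this extreme-point property, the following sketch (not part of the original text; the LP instance is made up for illustration) finds the optimum of a small two-variable linear program by brute force: it enumerates the candidate vertices of the feasible set, i.e. the intersections of each pair of constraints taken as equalities, discards the infeasible ones, and evaluates the objective only at the feasible extreme points.

```python
# Minimal sketch: maximize f(x) = 3*x1 + 2*x2
# subject to x1 + x2 <= 4, x1 + 3*x2 <= 6, x1 >= 0, x2 >= 0.
# Each constraint a1*x1 + a2*x2 <= b is stored as ((a1, a2), b);
# the nonnegativity conditions are written as -x1 <= 0 and -x2 <= 0.
from itertools import combinations

constraints = [((1, 1), 4), ((1, 3), 6), ((-1, 0), 0), ((0, -1), 0)]

def intersect(c1, c2):
    """Solve the 2x2 system where both constraints hold with equality."""
    (a1, b1), (a2, b2) = c1, c2
    det = a1[0] * a2[1] - a1[1] * a2[0]
    if abs(det) < 1e-12:          # parallel constraints: no vertex
        return None
    x1 = (b1 * a2[1] - a1[1] * b2) / det
    x2 = (a1[0] * b2 - b1 * a2[0]) / det
    return (x1, x2)

def feasible(x):
    return all(a[0] * x[0] + a[1] * x[1] <= b + 1e-9 for a, b in constraints)

f = lambda x: 3 * x[0] + 2 * x[1]

# Candidate vertices = pairwise intersections that satisfy all constraints.
vertices = [v for c1, c2 in combinations(constraints, 2)
            if (v := intersect(c1, c2)) is not None and feasible(v)]
best = max(vertices, key=f)
print(best, f(best))   # optimal extreme point and objective value
```

Here the feasible extreme points are $(0,0)$, $(4,0)$, $(0,2)$, and $(3,1)$, and the maximum $f=12$ is attained at the extreme point $(4,0)$. The simplex algorithm reaches the same vertex without exhaustive enumeration, by walking from one extreme point to an adjacent one with a better objective value.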
\begin{table}[ht]
\centering
\caption*{Table~1\hskip1em This is an example of a manually numbered table, which does not appear in the list of tables}
\label{tab:badtabular2}
\begin{tabular}[c]{|m{1.5cm}|c|c|c|c|c|c|}\hline
\multicolumn{2}{|c|}{Network Topology} & \# of nodes & \multicolumn{3}{c|}{\# of clients} & Server \\\hline
GT-ITM & Waxman Transit-Stub & 600 & \multirow{2}{2em}{2\%}& \multirow{2}{2em}{10\%}& \multirow{2}{2em}{50\%}& \multirow{2}{1.2in}{Max. Connectivity}\\\cline{1-3}
\multicolumn{2}{|c|}{Inet-2.1} & 6000 & & & &\\\hline
& \multicolumn{2}{c|}{ABCDEF} &\multicolumn{4}{c|}{} \\\hline
\end{tabular}
\end{table}

Roughly speaking, the simplex algorithm examines only the extreme points of the feasible set, rather than all feasible points. First, the simplex algorithm selects an extreme point as the initial point. Each successive extreme point is selected so as to improve the objective function value. The procedure is repeated until no further improvement in the objective function value can be made. The last extreme point is the optimal solution.

\subsection{Nonlinear Programming}

If at least one of the functions $f(x),g_j(x),j=1,2,\cdots,p$ is nonlinear, then SOP is called a {\em nonlinear programming}. A large number of classical optimization methods have been developed to treat specially structured nonlinear programming, based on the mathematical theory concerned with analyzing the structure of problems.

Now we consider a nonlinear programming problem that is concerned solely with maximizing a real-valued function with domain $\Re^n$. Whether or not derivatives are available, the usual strategy is first to select a point in $\Re^n$ that is thought to be the most likely place where the maximum exists. If there is no information available on which to base such a selection, a point is chosen at random. From this first point an attempt is made to construct a sequence of points, each of which yields an improved objective function value over its predecessor.
The next point to be added to the sequence is chosen by analyzing the behavior of the function at the previous points. This construction continues until some termination criterion is met. Methods based upon this strategy are called {\em ascent methods}, which can be classified as {\em direct methods}, {\em gradient methods}, and {\em Hessian methods} according to the information about the behavior of the objective function $f$ that they use. Direct methods require only that the function can be evaluated at each point. Gradient methods require the evaluation of the first derivatives of $f$. Hessian methods require the evaluation of the second derivatives. In fact, no method is superior for all problems; the efficiency of a method depends very much on the objective function.

\subsection{Integer Programming}

{\em Integer programming} is a special form of mathematical programming in which all of the variables are assumed to take only integer values. When there are not only integer variables but also conventional continuous variables, we call it {\em mixed integer programming}. If all the variables are restricted to either 0 or 1, then the problem is termed a {\em zero-one programming}. Although integer programming can in theory be solved by {\em exhaustive enumeration}, it is impractical to solve realistically sized integer programming problems this way. The most successful algorithm found so far for solving integer programming is the {\em branch-and-bound enumeration} developed by Balas (1965) and Dakin (1965). Another technique for integer programming is the {\em cutting plane method} developed by Gomory (1959).

\hfill\textit{Uncertain Programming\/}\quad(\textsl{BaoDing Liu, 2006.2})

\section*{References}

\noindent{\itshape NOTE: These references are only for demonstration. They are not real citations in the original text.}

\begin{translationbib}
\item Donald E. Knuth. The \TeX book. Addison-Wesley, 1984. ISBN: 0-201-13448-9.
\item Paul W. Abrahams, Karl Berry and Kathryn A. Hargreaves.
\TeX\ for the Impatient. Addison-Wesley, 1990. ISBN: 0-201-51375-7.
\item David Salomon. The Advanced \TeX book. New York: Springer, 1995. ISBN: 0-387-94556-3.
\end{translationbib}

\chapter{Reading Report or Written Translation of the Foreign-Language Material}

\title{The Chinese title of the English material}

{\heiti Abstract:} This chapter contains the translation of the foreign-language material. If there is an abstract, it can be written here directly; there seems to be no explicit rule for this part.

\section{Single-Objective Programming}

In the Northern Darkness there is a fish whose name is Kun. The Kun is so huge that no one knows how many thousand li it measures. It changes into a bird whose name is Peng, and the back of the Peng is so broad that no one knows how many thousand li it stretches. When it rouses itself and flies, its wings are like clouds hanging across the sky. When the sea begins to move, this bird sets off for the Southern Darkness, which is the Lake of Heaven.
\begin{equation}\tag*{(123)}
p(y|\mathbf{x}) = \frac{p(\mathbf{x},y)}{p(\mathbf{x})}= \frac{p(\mathbf{x}|y)p(y)}{p(\mathbf{x})}
\end{equation}
Our life has a limit, but knowledge has none. To pursue the limitless with the limited is perilous; and to go on pursuing knowledge when one already knows this is perilous indeed! In doing good, stay clear of fame; in doing evil, stay clear of punishment. Follow the middle course and take it as your constant principle: then you may protect yourself, keep your life whole, nourish your parents, and live out your years.

\subsection{Linear Programming}

Cook Ding was cutting up an ox for Lord Wenhui. At every touch of his hand, every heave of his shoulder, every step of his foot, every press of his knee, the flesh fell away with a swish, and every stroke of the knife sang in perfect rhythm, as though he were performing the dance of the Mulberry Grove or keeping time with the Jingshou music.

\begin{table}[ht]
\centering
\caption*{Table~1\hskip1em This is an example of a manually numbered table, which does not appear in the list of tables}
\label{tab:badtabular3}
\begin{tabular}[c]{|m{1.5cm}|c|c|c|c|c|c|}\hline
\multicolumn{2}{|c|}{Network Topology} & \# of nodes & \multicolumn{3}{c|}{\# of clients} & Server \\\hline
GT-ITM & Waxman Transit-Stub & 600 & \multirow{2}{2em}{2\%}& \multirow{2}{2em}{10\%}& \multirow{2}{2em}{50\%}& \multirow{2}{1.2in}{Max.
Connectivity}\\\cline{1-3}
\multicolumn{2}{|c|}{Inet-2.1} & 6000 & & & &\\\hline
& \multicolumn{2}{c|}{ABCDEF} &\multicolumn{4}{c|}{} \\\hline
\end{tabular}
\end{table}

Lord Wenhui said, ``Ah, excellent! That skill should attain such heights!'' Cook Ding laid down his knife and replied, ``What your servant cares about is the Way, which goes beyond skill. When I first began cutting up oxen, all I could see was the whole ox. After three years I no longer saw the whole ox. And now I meet the ox with my spirit and do not look with my eyes; perception and understanding come to a stop, and the spirit moves where it wants. I go along with the natural makeup, strike into the great hollows, guide the knife through the great openings, and follow things as they inherently are. I have never so much as grazed a ligament or tendon, much less a main joint! A good cook changes his knife once a year, because he cuts; a mediocre cook changes his knife once a month, because he hacks. I have had this knife of mine for nineteen years and have cut up several thousand oxen with it, yet the blade is as good as though it had just come from the grindstone. There are spaces between the joints, and the blade of the knife has no thickness; if you insert what has no thickness into such spaces, there is plenty of room for the blade to play about in. That is why after nineteen years the blade of my knife is still as good as when it first came from the grindstone. Even so, whenever I come to a complicated place, I see the difficulty and proceed with the greatest caution: my gaze comes to rest, my movements slow down, and I move the knife with the utmost subtlety until, flop, the whole thing comes apart like a clod of earth crumbling to the ground. Then I stand there holding the knife, look all around, linger in quiet satisfaction, and wipe the knife and put it away.'' Lord Wenhui said, ``Excellent! I have heard the words of Cook Ding and learned from them how to nurture life.''

\subsection{Nonlinear Programming}

Confucius was a friend of Liuxia Ji, whose younger brother was known as Robber Zhi. Robber Zhi, with nine thousand followers, rampaged across the world, assaulting and terrorizing the feudal lords, breaking into houses, driving off people's oxen and horses, and seizing their wives and daughters. Greedy for gain, he forgot his kin, gave no thought to his parents and brothers, and offered no sacrifices to his ancestors. Wherever he passed, the great states manned their walls and the small states withdrew behind their fortifications, and the multitudes suffered because of him. Confucius said to Liuxia Ji, ``A father must be able to instruct his son, and an elder brother must be able to teach his younger brother. If a father cannot instruct his son and an elder brother cannot teach his younger brother, then the ties between father and son and between brothers count for nothing. Now you, sir, are one of the talented gentlemen of the age, yet your younger brother is Robber Zhi, a scourge to the whole world, and you have not been able to teach him. I am ashamed for you, and I beg leave to go and speak with him on your behalf.'' Liuxia Ji replied, ``You say that a father must be able to instruct his son and an elder brother must be able to teach his younger brother. But if the son will not listen to his father's instruction and the younger brother will not accept his elder brother's teaching, what good will even eloquence such as yours do? Moreover, Zhi is a man whose heart gushes like a spring and whose will whirls like the wind; he is strong enough to fend off any enemy and glib enough to gloss over any wrong. Go along with his mood and he is pleased; cross him and he flies into a rage, and he readily humiliates people with his words. You must not go.'' Confucius did not listen. With Yan Hui driving his carriage and Zigong at his right, he went off to see Robber Zhi.

\subsection{Integer Programming}

Robber Zhi was just then resting his band of followers on the southern slope of the great mountain, feasting on minced human liver. Confucius stepped down from his carriage, went forward, and said to the usher, ``Kong Qiu of Lu, having heard of the general's lofty righteousness, respectfully salutes you twice.'' The usher went in and announced him. When Robber Zhi heard this he flew into a great rage; his eyes blazed like bright stars and his hair stood up so that it lifted his cap. He said, ``Is this not that crafty hypocrite Kong Qiu of the state of Lu? Tell him for me: you invent phrases and coin sayings, recklessly invoking Kings Wen and Wu; you wear a cap like a branching tree and a belt from the ribs of a dead ox; with your torrents of words and false doctrines you eat without plowing and clothe yourself without weaving; wagging your lips and drumming your tongue, you manufacture rights and wrongs as you please, leading astray the rulers of the world and keeping the scholars of the world from returning to the root of things, while you recklessly preach filial piety and brotherly duty and angle for the luck of a fief and the favor of the rich and noble. Your crime is huge and grave. Run home at once! If you do not, I will add your liver to our midday meal!''

\chapter{Other Appendices}

The first two appendices are mainly intended as examples for undergraduates. The content of any other appendices can be placed here; of course, if you prefer, you can also put this part in a separate file and then include it in the main file.