## Areas of Intelligent Systems: Reasoning, Discretization, and Learning

Professor Emeritus of Computer Science
University of Texas at Dallas

Contents: Intelligent systems · Levels of thinking · Nonmonotonicity and incompleteness · Effective questioning · Three problems: Q-ALL SAT, Q-MINFIX UNSAT, Q-MINFIX SAT · Separations · Discretization · Important factors · Explanations · Application examples · References
Intelligent Systems
Definition: Intelligent System
A system is intelligent if it accomplishes feats that, when carried out by humans, require a substantial amount of intelligence.

Examples: medical diagnosis, processing of natural language, supervision of complex processes.

Definition: Expert System
An expert system is an intelligent system which, in an interactive setting, asks a person for information and, based upon the response, draws conclusions or gives advice.

Definition: Intelligent Agent
An intelligent agent is an intelligent system which perceives its environment by sensors and which uses that information to act upon the environment.

Expert systems and intelligent agents are special cases of intelligent systems.

An example of an intelligent system that is not an intelligent agent: an intelligent system that constructs expert systems from data. That intelligent system is not an intelligent agent, unless one accepts the odd notion that creating an expert system constitutes acting upon an environment.

"Model": In propositional logic, a satisfying solution. Here, a mathematical formulation of a real-world situation.

"Valid": In propositional logic, a formula that always evaluates to True. Here, a formulation that correctly represents a part of the real world of interest.

Levels of Thinking, Informal Definition
First Level (= thinking about problems)
Direct reasoning about problems.

Typical questions:
"Do symptoms s and t imply that disease d or e is present?"
"Is this log-in behavior typical for a hacker?"
Second Level (= thinking about thinking)
We think about which logic module of the first level is to be used. May involve selection, modification, or creation of a module.

Third Level (= thinking about thinking about thinking)
We think about which process of the second level is to be used.

Reduction to thinking at a lower level generally cannot be achieved by a polynomial algorithm. Note: "cannot" is meant pragmatically, in the sense that presently nobody knows of a polynomial reduction.

Link to the polynomial hierarchy of the theory of computation: the k-th level of thinking corresponds to the k-th level of the hierarchy.

Quantified Boolean Formula (QBF)
a, b, c, d, … = vectors of Boolean variables
∀a ∃b ∀c ∃d … D(a, b, c, d, …)
Each quantifier introduces another level.
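For small variable counts, such a quantified formula can be evaluated by brute force, one quantifier per recursion level. A minimal sketch (function name and encoding are illustrative, not from the talk):

```python
def eval_qbf(quantifiers, formula, assignment=()):
    """Evaluate a quantified Boolean formula by brute force.

    quantifiers: one 'forall' or 'exists' entry per variable, outermost first.
    formula: function mapping a tuple of booleans (one per variable) to bool.
    """
    if len(assignment) == len(quantifiers):
        return formula(assignment)
    q = quantifiers[len(assignment)]
    branches = (eval_qbf(quantifiers, formula, assignment + (v,))
                for v in (False, True))
    return all(branches) if q == 'forall' else any(branches)

# forall a exists b: (a or b) and (not a or not b) -- choose b = not a
print(eval_qbf(['forall', 'exists'],
               lambda t: (t[0] or t[1]) and (not t[0] or not t[1])))  # True
```

Each additional quantifier alternation adds one recursion layer, matching the level correspondence noted above.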

Examples: First-Level Thinking
Deduction of theorems T from the axioms of a propositional logic formula S.

Logic Minimization
Finding the best satisfying solution for a propositional logic formula S given as a CNF system. Accelerated theorem proving. Handling of special satisfiability cases.

Reasoning Examples
Examples: Second-Level Thinking
Nonmonotonic Reasoning
A conclusion may become invalid when additional information is obtained. The axioms must be modified.

Reasoning with Incomplete Axioms
The desired conclusion cannot be derived. The axioms must be modified.

Discovery of Futility
Decide if desired conclusion cannot be derived even if all as-yet-unknown data are obtained.

Reasoning Examples
Examples: Third-Level Thinking
Selection of Second-Level Reasoning
Decide which type of nonmonotonic reasoning or which discovery method of futility is to be used.

Question Selection
Questions are selected so that, as soon as possible, the desired conclusion can be proved or shown to be futile.

Computations constructing intelligent systems that reason at the first and second levels.

Construction Methods
Direct Construction
Analyze problem. Formulate logic conditions and constraints.

Define and implement all system operations.

Construction via Learning
Obtain data about the problem. Derive logic conditions and constraints from the data. Use a metamethod to construct modules for all system operations.

An Extension of Propositional Logic
- Cost of True or False for a variable; call these Truecost and Falsecost.

- Representation of unknown values by two values: Absent and Unavailable.

- Likelihood of clauses being applicable.

- Predicates with finite quantifications ∃ (there exists) and ∀ (for all).

- Quantification of propositional variables. For example, for all True/False values of a subset of the variables, there exist True/False values for the remaining variables so that the formula has a specified value. Can be viewed as a special case of a predicate with finite quantification.

Extension of Logic
For variable x: Truecost = 10, Falsecost = −5

U, V finite sets:
∀u ∈ U ∀v ∈ V [¬g(u, v) ∨ h(u, v)]

x = Absent: value of x is unknown, but could be obtained
x = Unavailable: value of x cannot be obtained
Extension of Logic
Example of Quantification of Propositional Variables:
CNF formula R with vectors q and x of variables.

CNF formula S with vectors q and y of variables.

For all True/False values of the qi of vector q: if R is satisfiable, then S is satisfiable as well.

Write as: ∀q (∃x R → ∃y S)
First-Level Thinking
Problems SAT and MINSAT
We often use "CNF system" instead of "CNF formula".

CNF system: a list of variables; a list of CNF clauses, each of which uses a subset of the variables.

Problem SAT
Instance: CNF system S.

Solution: Either: a satisfying solution of S. Or: the conclusion that S is not satisfiable.

Problem MINSAT
Instance: CNF system S. For each variable of S, two rational cost values associated with the values True and False for that variable.

Solution: Either: a satisfying solution of S for which the total cost is minimum. Or: the conclusion that S is unsatisfiable.
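Both problems can be solved for small instances by plain enumeration. A sketch under an illustrative encoding (clauses as lists of (variable, polarity) literals; none of these names come from the talk):

```python
from itertools import product

def solve_minsat(variables, clauses, cost):
    """Brute-force MINSAT: cheapest satisfying assignment, or None.

    clauses: list of clauses; a clause is a list of (variable, polarity)
    literals, e.g. ('x', True) means x and ('x', False) means not-x.
    cost[v] = (cost of v = True, cost of v = False).
    """
    best = None
    for values in product([True, False], repeat=len(variables)):
        a = dict(zip(variables, values))
        if all(any(a[v] == pol for v, pol in clause) for clause in clauses):
            c = sum(cost[v][0] if a[v] else cost[v][1] for v in variables)
            if best is None or c < best[0]:
                best = (c, a)
    return best  # None means the CNF system is unsatisfiable

# (x or y) and (not-x or not-y); True costs 10, False costs -5
best = solve_minsat(['x', 'y'],
                    [[('x', True), ('y', True)], [('x', False), ('y', False)]],
                    {'x': (10, -5), 'y': (10, -5)})
# best[0] == 5: one variable True (cost 10), the other False (cost -5)
```

Dropping the cost bookkeeping turns the same loop into a SAT solver: any assignment that passes the clause check is a satisfying solution.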

Second-Level Thinking
Deduction: Core approach of mathematics.

Most reasoning in everyday life is not deduction.

1. Operation of light bulb by switch.

2. Learning to drive by experimentation.

Mathematics: use probability theory and induction to convert such reasoning to deduction.

Here: we want to use just one tool, an extension of propositional logic.

Nonmonotonicity (short)
Monotonicity: when additional facts become available, previously proved theorems remain theorems.

Monotonicity of first-order logic is essential for mathematics.

Nonmonotonicity: a conclusion may become invalid when additional information is obtained. Examples:

1. A conclusion by abduction turns out to be in error.

2. Use of the value Unavailable in a Question-and-Answer process eliminates CNF clauses and thus may eliminate theorems.

Nonmonotonicity (short)
This is not a case of Gödel's Incompleteness Theorem (1931).

We simply do not have the needed clauses.

Statement Y = (¬X → R) = (X ∨ R) should be a theorem, but is not.

Thus, S ∧ ¬Y = S ∧ ¬X ∧ ¬R is satisfiable, but should be unsatisfiable.

Solution: S′ = S ∧ Y.
S″ = S′ ∧ ¬X = S ∧ Y ∧ ¬X may be unsatisfiable, but should be satisfiable. Must modify S′ to eliminate this effect.

Problem Q-ALL SAT
Subproblem of Treating Futility of Questions in Expert System
The expert system asks for information to prove some result. If the result cannot be proved at all, the system should stop as soon as this fact becomes evident.

A small example …

Problem Q-ALL SAT
Questions q1, q2, q3. Defects r, s, t.

CNF system R has relationships connecting q1, q2, q3.

CNF system S links questions and defects:
q1 → (r ∨ t)
q2 → (¬r ∨ ¬t)
(q1 ∧ q2) → ¬r
q3 → (s ∨ ¬t)

Suppose we want to prove defect t. We are given the value q1 = False. It is easy to see that we cannot prove t with this information.

Problem: Can t possibly be proved by some values for q2 and q3?

If the answer is "no", then it is useless to ask for the values of q2 and q3, and we should stop. Thus, proving t is futile.

Problem Q-ALL SAT
CNF system S′; enforces ¬t for theorem proving.

¬q1 ∨ r ∨ t
¬q2 ∨ ¬r ∨ ¬t
¬q1 ∨ ¬q2 ∨ ¬r
¬q3 ∨ s ∨ ¬t
¬t

Proving t is futile if, for all True/False values of q1, q2, q3 satisfying R, the CNF system S′ is satisfiable.

Enumeration shows that proving t is indeed futile.

On the other hand, if originally q1 = True, then there are values for q2, q3 so that t can be proved; take q2 = True.
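The enumeration can be done mechanically. A brute-force sketch (the encoding is illustrative; since the slide does not list the clauses of R, R is taken here as empty, i.e. every assignment of q1, q2, q3 is R-acceptable):

```python
from itertools import product

def satisfiable(clauses, fixed):
    """Check CNF satisfiability by trying all values of the free variables."""
    free = sorted({v for cl in clauses for v, _ in cl} - set(fixed))
    for values in product([True, False], repeat=len(free)):
        a = {**fixed, **dict(zip(free, values))}
        if all(any(a[v] == pol for v, pol in cl) for cl in clauses):
            return True
    return False

# S' from the slide; it enforces not-t for theorem proving of t
S1 = [[('q1', False), ('r', True), ('t', True)],
      [('q2', False), ('r', False), ('t', False)],
      [('q1', False), ('q2', False), ('r', False)],
      [('q3', False), ('s', True), ('t', False)],
      [('t', False)]]

# q1 = False: S' stays satisfiable for every q2, q3 -- proving t is futile
futile = all(satisfiable(S1, {'q1': False, 'q2': b2, 'q3': b3})
             for b2, b3 in product([True, False], repeat=2))
print(futile)  # True

# q1 = True, q2 = True makes S' unsatisfiable, so t becomes provable
print(satisfiable(S1, {'q1': True, 'q2': True}))  # False
```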

Problem Q-ALL SAT
Definition: An assignment of True/False values to q1, …, ql is R-acceptable if the SAT instance of R with q1, …, ql fixed according to the given assignment is satisfiable.

Problem Q-ALL SAT
Instance: Satisfiable CNF systems R and S that have l ≥ 1 special variables q1, …, ql among their common variables.

Solution: Either: "All R-acceptable assignments of True/False values for q1, …, ql are S-acceptable." Or: an R-acceptable assignment of True/False values for q1, …, ql that is S-unacceptable.

Problem Q-ALL SAT
Polynomial Hierarchy
a, b, c, d, … = vectors of Boolean variables
∀a ∃b ∀c ∃d … D(a, b, c, d, …)

Here: CNF systems R and S
∀q [∃x R → ∃y S]

Transformation to the standard form is trivial. However, even the fastest present-day algorithms for the standard case do not perform well on this particular problem, since the structure is not exploited.

Problem Q-ALL SAT
Exact Algorithm for Q-ALL SAT
Ongoing work with A. Remshagen. Recursively fixes qk variables and tries out the effect. Learns clauses and predicts SAT behavior to reduce the size of the search tree, using an extension of the heuristic algorithm given later. As an aside, the algorithm proves some special cases to be in NP.

1. Do unit resolution in R on all variables. All Q variables fixed in R by unit resolution are fixed in S as well.

2. Do unit resolution in S on the Y variables only.

3. If Q = ∅: if R is satisfiable and S is unsatisfiable, return False; otherwise return True.

4. Select a Q variable q.

5. If QRSsat(Q \ {q}, R(q = True), S(q = True)) = True, then return QRSsat(Q \ {q}, R(q = False), S(q = False)); otherwise return False.
Learn Clauses and Make SAT Predictions
1. The algorithm learns clauses before backtracking from a given fixing. Main idea: unfix Q variables while showing that the same conclusion is still valid. The remaining fixed variables produce a clause that excludes that fixing. The learned clauses are added to R. An enhanced version of the next heuristic is used for the learning of the clauses.

2. Predict SAT behavior of S using the values of the fixed Q and Y variables, plus a greedy heuristic. Use the predictions for backtracking and for the selection of the next Q variable to be fixed.
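Stripped of unit resolution, clause learning, and SAT prediction, the branching skeleton of the recursion can be sketched as follows (a brute-force illustration, not the actual QRSsat implementation; R is again taken as empty since the slide does not list its clauses):

```python
from itertools import product

def satisfiable(clauses, fixed):
    """Brute-force CNF satisfiability with some variables fixed."""
    free = sorted({v for cl in clauses for v, _ in cl} - set(fixed))
    for vs in product([True, False], repeat=len(free)):
        a = {**fixed, **dict(zip(free, vs))}
        if all(any(a[v] == pol for v, pol in cl) for cl in clauses):
            return True
    return False

def qrs_sat(Q, R, S, fixed=None):
    """True iff every R-acceptable assignment of the Q variables is
    S-acceptable.  Only the branching structure of the exact algorithm
    is shown; the pruning machinery is omitted."""
    fixed = dict(fixed or {})
    if not Q:
        # an R-acceptable leaf with unsatisfiable S is a counterexample
        return not (satisfiable(R, fixed) and not satisfiable(S, fixed))
    q, rest = Q[0], Q[1:]
    return (qrs_sat(rest, R, S, {**fixed, q: True}) and
            qrs_sat(rest, R, S, {**fixed, q: False}))

# the slide's system S'
S1 = [[('q1', False), ('r', True), ('t', True)],
      [('q2', False), ('r', False), ('t', False)],
      [('q1', False), ('q2', False), ('r', False)],
      [('q3', False), ('s', True), ('t', False)],
      [('t', False)]]

print(qrs_sat(['q1', 'q2', 'q3'], [], S1))           # False
print(qrs_sat(['q2', 'q3'], [], S1, {'q1': False}))  # True
```

The first call finds the counterexample q1 = True, q2 = True; the second confirms futility once q1 = False is fixed, matching the example above.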

Problem Q-ALL SAT
Heuristic Algorithm for Q-ALL SAT
1. If deletion of all free qk reduces S to a satisfiable S′: fixing of any qk cannot produce unsatisfiability. Hence, the Q-ALL SAT instance has been solved.

2. (S′ is unsatisfiable) Compute a minimal unsatisfiable subset of the clauses of S′; let S̄ be the corresponding subsystem of S.

Check whether qk values exist that satisfy R while causing all literals of the qk in S̄ to evaluate to False.

If such qk exist, then the Q-ALL SAT instance has been solved.

Problem Q-ALL SAT
Problem Q-ALL SAT: QRSsat3 Test Results
1. Robot navigation through a grid with obstacles.

Question: Can obstacles be so placed that they block the robot? 18 instances.

2. Game tree where the first player tries to prevent the opponent from reaching a goal.

Question: Can the first player achieve the goal? 12 sets of 12 instances each.

Problem Q-ALL SAT Results
Computational Results
Solution Times (sec) (table not reproduced). Footnotes:

(1) Robot case: each figure is total time (sec) for 18 instances. Game case: each figure is average time per set of 12 instances.

(2) Error termination in 4 instances.

(3) 1 hr time limit exceeded in 15 out of 18 instances. Used 3,600 sec for these cases in the statistic.

(4) 1 hr limit exceeded for 14 instances.

(5) 1 hr limit exceeded for 4 instances. Error termination for 10 instances.

(6) 1 hr limit exceeded for 10 instances.

Robot instances: QRSsat3 at least 200 times faster.

Game instances: QRSsat3 at least 400 times faster.

Optimization Versions of Q-ALL SAT
Problem Q-MINFIX UNSAT
Subproblem of Learning to Ask Relevant Questions
Problem Q-MINFIX UNSAT
Instance: Satisfiable CNF system S containing l ≥ 1 special variables q1, …, ql. An S-unacceptable assignment. For each qj, a cost of obtaining the given True/False value of qj.

Solution: An S-unacceptable partial assignment with minimum total cost.

Must carry out tests with the given costs to get sets of values.

Problem Q-MINFIX UNSAT (short)
In a medical diagnostic system, we have obtained symptom values q1, q2, …, ql by some tests, and have proved some disease t to be present.

Want to find out in hindsight which tests should have been done to get values for some of the variables q1, …, ql so that the disease could have been proved at minimum total cost.

Problem Q-MINFIX SAT
Subproblem of Learning to Ask Relevant Questions
Problem Q-MINFIX SAT
Instance: Satisfiable CNF systems R and S having l ≥ 1 variables q1, …, ql among their common variables. An assignment for q1, …, ql that is both R-acceptable and S-acceptable. For each qj, a cost of obtaining the given True/False value for qj.

Solution: A partial assignment such that each R-acceptable extension assignment is S-acceptable. Subject to that condition, the total cost of the partial assignment must be minimum.

Must carry out tests with the given costs to get sets of values.

Problem Q-MINFIX SAT (short)
In a medical diagnostic system, we have obtained symptom values q1, q2, …, ql by some tests, and have determined that some disease t cannot be proved.

Want to find out in hindsight which tests should have been done to get values for some of the variables q1, …, ql so that the same conclusion could have been obtained at minimum total cost.

Training sets A and B consist of records of length n. The k-th entry is the value for attribute k. Entries may be True, False, or Unavailable.

The value Unavailable means that one cannot get a True/False value.

Note: Generally, another value is possible: Absent. In that case, one does not know the value, but can obtain it. In this talk, we will not make use of that option.

Separations (short)
Separation Problem
Find a logic formula that is True on A and False on B, or show that this cannot be done. The formula separates A from B.

The sets A and B usually are taken from two populations A and B. A separating formula for A and B may then be used to predict whether a record is in A or B.

Finding Separating Formulas
There are effective methods to find separating formulas. In our approach, the formulas are in disjunctive normal form (DNF), and we construct them by a recursive process.

Example: (x ∧ ¬y ∧ z) ∨ (¬x ∧ v) ∨ (y ∧ w)
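Checking that a candidate DNF formula actually separates two record sets is straightforward. A sketch with an illustrative encoding (terms as lists of (attribute, polarity) literals; the names are not from the talk):

```python
def dnf_value(formula, record):
    """Evaluate a DNF formula on a record (dict of attribute -> bool).

    formula: list of terms; a term is a list of (attribute, polarity).
    """
    return any(all(record[a] == pol for a, pol in term) for term in formula)

def separates(formula, A, B):
    """True iff the formula is True on every record of A and False on B."""
    return (all(dnf_value(formula, r) for r in A) and
            not any(dnf_value(formula, r) for r in B))

# (x and not-y) or (not-x and v): a two-term DNF in the slide's style
f = [[('x', True), ('y', False)], [('x', False), ('v', True)]]
A = [{'x': True, 'y': False, 'v': False},
     {'x': False, 'y': True, 'v': True}]
B = [{'x': True, 'y': True, 'v': True},
     {'x': False, 'y': False, 'v': False}]
print(separates(f, A, B))  # True
```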
Given: Numerical data sets A and B.

Goal: Logic data sets A′ and B′ representing A and B.

Most popular method: entropy plus the Minimum Description Length Principle (MDLP). The principle generally selects the hypothesis that minimizes the description length of the hypothesis plus the description length of the data given the hypothesis.

Here, we apply a new method of pattern analysis.

Discretization (short)
Processing of one Attribute
1. Sort the numerical entries of the attribute. Label each entry of the sorted list by "A" (resp. "B") if coming from a record of A (resp. B). The result is a label sequence.

Example: 10.8 3.7 2.9 1.7 0.5 −1.0 −3.5 −11.9

2. Find an interval where the sequence switches from mostly As to mostly Bs. The more rapid the switch, the more important the interval. Put a cutpoint c into the middle of the interval. Details are given in a moment.

Logic attribute yc: if x ≤ c, then yc = False; if x > c, then yc = True.
Discretization (short)
1. In the label sequence, replace each A by 1 and each B by 0.

2. Smooth the sequence of 0s and 1s by Gaussian convolution. The variance of the Gaussian convolution is determined by evaluation of a competing random process. The goal is smoothing so that randomly introduced differences are eliminated.

3. Select the cutpoint where the smoothed data change by the maximum difference.

Example: 10.8 3.7 2.9 1.7 0.5 −1.0 −3.5 −11.9
cutpoint c = (1.7 + 0.5)/2 = 1.1
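The three steps can be sketched as follows. The variance selection via a competing random process is not reproduced here, so a fixed sigma is assumed; the function name is illustrative:

```python
import math

def cutpoint(values, labels, sigma=1.0):
    """Pick a cutpoint for one numerical attribute.

    values: entries sorted in decreasing order; labels: 'A'/'B' per entry.
    Replaces A by 1 and B by 0, smooths with a Gaussian kernel of fixed
    sigma (the talk tunes sigma against a competing random process), and
    cuts where the smoothed sequence changes the most.
    """
    seq = [1.0 if lb == 'A' else 0.0 for lb in labels]
    n = len(seq)
    smooth = []
    for i in range(n):
        w = [math.exp(-((i - j) ** 2) / (2 * sigma ** 2)) for j in range(n)]
        smooth.append(sum(wj * sj for wj, sj in zip(w, seq)) / sum(w))
    k = max(range(n - 1), key=lambda i: abs(smooth[i] - smooth[i + 1]))
    return (values[k] + values[k + 1]) / 2

vals = [10.8, 3.7, 2.9, 1.7, 0.5, -1.0, -3.5, -11.9]
labs = ['A', 'A', 'A', 'A', 'B', 'B', 'B', 'B']
print(cutpoint(vals, labs))  # 1.1, matching the slide's example
```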
Important Factors
Discussion via the example of cancer data.

Cervical Cancer: FIGO I-III versus Recurrence
Derive factors explaining the difference between FIGO I-III and Recurrence, using lab data.

1. Partition the data into FIGO I-III cases and Recurrence cases.

2. Discretize the two data sets, getting sets A for FIGO I-III and B for Recurrence.

3. Compute a separating logic formula achieving True for A and False for B.

For each literal (= occurrence of a possibly negated variable) of the formula, count the number of records for which the literal is needed to achieve True for the formula. The count is the importance value of the literal for the formula.

Repeat the above step, but exchange the roles of A and B.

4. For each literal, normalize the two importance values using the number of records of A or B, whichever applies. Thus, get average importance values.

5. The variables for which at least one of the two possible literals has average importance value ≥ 0.4 are the important factors.
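The counting of Step 3 can be sketched as follows. The slides do not fully specify the counting rule, so the reading below, that a literal is needed for a record when flipping that literal's variable turns the formula from True to False, is an assumption, and all names are illustrative:

```python
def dnf_value(formula, record):
    """Evaluate a DNF formula (list of terms; term = list of
    (attribute, polarity) literals) on a record."""
    return any(all(record[a] == pol for a, pol in term) for term in formula)

def importance_values(formula, records):
    """Average importance per literal: the fraction of records for which
    flipping the literal's variable turns the formula from True to False
    (one plausible reading of the counting rule)."""
    counts = {lit: 0 for term in formula for lit in term}
    for r in records:
        if not dnf_value(formula, r):
            continue
        for (var, pol) in counts:
            if not dnf_value(formula, {**r, var: not r[var]}):
                counts[(var, pol)] += 1
    return {lit: c / max(len(records), 1) for lit, c in counts.items()}

# single-term formula: both literals are needed for every satisfying record
f = [[('x', True), ('y', False)]]
A = [{'x': True, 'y': False}, {'x': True, 'y': False}]
vals = importance_values(f, A)
print(vals)  # {('x', True): 1.0, ('y', False): 1.0}
```

Applying a threshold such as 0.4 to these averages then yields the important factors of Step 5.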

Explanations
Discussion uses the example of FIGO I-III versus Recurrence.

1. Delete from the data all attributes except for the important factors determined in the previous step.

2. Discretize the sets of FIGO I-III and Recurrence records using the important factors.

3. Compute a logic formula that is True on A and False on B, and a second formula that is False on A and True on B.

The logic formulas, combined with the discretization information, constitute the desired explanations.

Statistical Significance
We define a statistic with 0/1 values via the explanations obtained in the previous step.

Hypothesis H0: The explanations produce accurate predictions.

Hypothesis H1: The explanations do not produce accurate predictions. Indeed, with some luck, the same accuracy can be achieved by flipping an unbiased coin, which statistically is a Bernoulli trial with α = 0.5.

Statistical Significance
Procedure: Find Explanations and Establish Significance
1. Split the given data into a training set and a testing set.

2. Obtain explanations from the training data using the process described earlier.

3. Apply the explanations to the testing data and determine how often the explanations are correct/incorrect.

4. Compare the outcome of Step 3 with results of Bernoulli trials with α = 0.5.

5. Use direct computation or approximation by the normal distribution to obtain the probability p that Bernoulli trials obtain the same results or better. If p is very small, then accept H0. Otherwise, accept H1.
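The direct computation of Step 5 is a binomial tail sum. A minimal sketch (function name and the 17-of-20 figures are illustrative):

```python
from math import comb

def coin_tail_probability(n, k):
    """Probability that n fair-coin flips (Bernoulli trials with success
    chance 0.5) give at least k successes, i.e. the chance that guessing
    matches or beats k correct predictions out of n."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

# e.g. 17 correct predictions out of 20 test cases
p = coin_tail_probability(20, 17)
print(round(p, 5))  # 0.00129
```

For larger n, the normal approximation mentioned in Step 5 avoids the exact sum.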

Explanation Examples
Explanation Examples: Breast Cancer
The data sets used in this subsection were supplied by the Frauenklinik of the Ruhr-Universität.

1. Herceptin/Xeloda Study
Records have 38 attributes:

SURVZ (living at present)
HER2 (value of second test)
Thymidinphosphorylase:
  TP TUMOR
  TP TISSUE
  TP OVERALL
VEGF (Vascular Endothelial Growth Factor)
COX2 (Cyclooxygenase 2)
K18 (Keratin 18)
HAT AD MED (adjuvant hormone therapy: medication
  (1 = tamoxifen, 2 = GnRH-Analogon, 3 = Aromatase Inhibitor, 4 = other, ?))
HER2 STAT (HER2 status (2, 3, ?))
FISH STAT (FISH status (0, 1))
HISTO TYP (histological tumor type (1, 2, 3))
PT (tumor size (1, 2, 3, 4, ?))
PN (lymph node status (1, 2, 3, ?))
M (metastasis (0, 1, ?))
G (grading (1, 2, 3, ?))
REZ ER (estrogen receptor expression (0, 1, ?))
REZ PR (progesterone receptor expression (0, 1, ?))
Local recurrence and distant metastasis status:
  AT LOK (local)
  AT ABD (abdominal)
  AT HEP (liver)
  AT PUL (lungs)
  AT ZNS (central nervous system)
  AT PERI (heart)
  AT PLEU (pleura)
  AT ASCI (ascites)
  AT LYM (lymphangiosis)
  AT KNO (bone)
HT AD (hormone therapy)
HT PA (palliative hormone therapy)
CT AD (chemotherapy)
CT PA (palliative chemotherapy)
ST (radiation)
BT (bisphosphonates)
Age (years)
BEST RES (best response
  (1 = complete response, 2 = partial response, 3 = no change, 4 = progressive disease))
TTP (time to progression (weeks))
SURV (survival time (weeks))
Explanation Examples (short)
Which factors influence time to progression (TTP)?

The most important factors influencing TTP are listed in a table (not reproduced).

*Note: Values computed by the initial method for importance factors. The threshold for that method was 0.2.
The data contain three patients with amazingly high TTP values. Analysis of the reasons produces the following explanation:

In two situations, high TTP values are predicted:
Case 1: TP TISSUE ≥ 6 and K18 ≥ 9
Case 2: TP TISSUE ≥ 6 and COX2 ≤ 2
2. Local Recurrence Versus Metastasis
Explain the difference between local recurrence and metastasis using lab data.

15 patients (3 local recurrence, 12 metastasis)
Records have 17 attributes, including:

PLAKO, PL M, PL Z, PL K, DESMO, DESMO1, DE M, DE Z, DE K
Important Factors
The difference between local recurrence and metastasis can be explained as follows.

If K18 ≥ 3.0 and PL M ≥ 2.0, then local recurrence case.

If K18 ≤ 2.0 or PL M ≤ 1.0, then metastasis case.

Explanation Examples: Cervical Cancer
The data sets used in this subsection were supplied by the Frauenklinik of the Charité.
1. Factors Influencing Time to Progression (TTP)
Records have 20 attributes:

RESP (responder (0 = no, 1 = yes))
AGE (years)
PARTUS (number of born children)
MENOP (menopause (0 = no, 1 = yes))
T (tumor size (FIGO))
N (lymph node metastasis (0 = no, 1 = yes))
(0 = poorly diff. squamous cell carcinoma, 1 = moderately diff. squamous cell carcinoma,
  2 = poorly diff. Adeno-Ca, 3 = SCC)
TTP (time to progression (months))
SCC 0 (squamous cell carcinoma antigen, baseline)
SCC 4 (after 4 weeks chemotherapy)
SVEGF 0 (SVEGF-A = vascular endothelial growth factor A in pg/ml in serum, baseline)
SVEGF 1 (after 1st cycle, 4th dose)
PVEGF 0 (PVEGF-A = vascular endothelial growth factor A in pg/ml in plasma, baseline)
PVEGF 1 (after 1st cycle, 4th dose)
SVEGF D 0 (SVEGF-D = vascular endothelial growth factor D in pg/ml in serum, baseline)
SVEGF D 1 (after 1st cycle, 4th dose)
EGF 0 (epidermal growth factor in pg/ml, baseline)
EGF 1 (after 1st cycle, 4th dose)
IGF 0 (IGF = insulin-like growth factor 1 in ng/ml, baseline)
IGF 1 (after 1st cycle, 4th dose)
Which factors influence time to progression (TTP)?

Two cases: (1) using initial data only; (2) using initial and subsequent data.

Both cases result in the same most-important attributes (table not reproduced).

*Note: Values computed by the initial method for importance factors. The threshold for that method was 0.2.

High TTP values (TTP ≥ 39) are predicted if MENOP = 1 and SVEGF D 0 ≥ 357.
Note: We also have PARTUS ≥ 1.

Low TTP values (TTP ≤ 19) are predicted if MENOP = 0 or SVEGF D 0 < 357.
Note: In the case of SVEGF D 0 < 357, we also have PARTUS = 0.

2. Difference Between FIGO I-III and Recurrence
Note: At present, treatments cannot utilize this information.

We include it here to demonstrate validation.

57 patients (31 for training, 26 for testing)
31 training cases: 19 FIGO I-III, 12 Recurrence
26 testing cases: 14 FIGO I-III, 12 Recurrence
Note: FIGO IV excluded since too few cases.

Records have 14 attributes. For each, an uncertainty interval was computed (the table pairing attribute names with intervals is not reproduced).
Important Factors
If ENDOSTATIN < 123.0 or M2PK PLASMA < 18.8, then FIGO I-III case.

If ENDOSTATIN > 123.0 and M2PK PLASMA > 21.8, then Recurrence case.

Accuracy of Prediction
22 of 26 cases are predicted correctly. Accuracy = 85%.

Significance of conclusion is p < 0.0002.

References
1. K. Truemper, "Effective Logic Computation" (Wiley, 1998).

2. K. Truemper, "Design of Logic-based Intelligent Systems" (Wiley, 2004).

In the book "Data Mining and Knowledge Discovery Approaches Based on Rule Induction Techniques", E. Triantaphyllou and G. Felici, eds., Springer, Berlin, 2006:

1. "Learning Logic Formulas and Related Error Distributions" (coauthored, G. Felici, F. Sun, and K. Truemper).

2. "Learning to Find Context-based Spelling Errors" (coauthored, H. Almubaid and K. Truemper).

3. "Transformation of Rational and Set Data to Logic Data" (coauthored, S. Bartnikowski, M. Granberry, J. Mugan, and K. Truemper).

"A MINSAT Approach for Learning in Logic Domains" (coauthored, G. Felici and K. Truemper), INFORMS Journal on Computing 14 (2002) 20–36.

Source: http://www.cs.concordia.ca/~chvatal/truemper.pdf
