This paper briefly surveys the intellectual context of John Henry Newman's lecture "The Office of Justifying Faith". Newman was an outstanding English theologian, writer and philosopher who had a great influence on the development of the Anglican and Roman Catholic churches in the 19th and 20th centuries. He became one of the leaders of the so-called Tractarian Movement. The Tractarians offered a radically new understanding of the relation between the Anglican Church and the other churches of the ecumene. The most famous expression of this rethinking was the branch theory, according to which the Anglican Church is one of the three branches of a single universal Church, alongside the Roman Catholic and Orthodox churches. According to Newman's doctrine of the via media, the Anglican Church represents a middle way between the Roman Catholic Church and the Protestant churches. In the "Lectures on Justification", published in 1838, Newman applies the doctrine of the via media to the set of problems surrounding the Christian idea of justification. The lecture "The Office of Justifying Faith" is in this sense a characteristic example of the via media doctrine: here Newman demonstrates the difference between the Anglican doctrine of the relationship between justification and faith and the Protestant doctrine of justification by faith alone.
The article presents an overview of the most popular interpretations of Moore's defence of common sense. His anti-skeptical argumentation is presented in the light of the new ideas of epistemological minimalism and reliabilism. Possible uses of his strategy against the arguments of the so-called "new skeptics" are discussed.
Normative dualism between descriptions of the mental and the physical remains a problem for many philosophers, one that provokes ever more attempts either to justify it or, on the contrary, to overcome it by means of reduction. The problem of the special normative status of mental states is usually considered in isolation from the concept of truth. Moreover, the question of truth is often construed merely as a part of the problem of normativity: on this view, truth is only a kind of norm, for example the goal of scientific inquiry. Donald Davidson, however, believed that truth is not a norm and that, on the contrary, norms are possible only through the use of the primitive and original concept of truth already available to us. In this paper, we propose that if one develops the idea of such a conceptual dependence between truth and norms in a certain way, it becomes possible to solve the problem of the normative gap between our descriptions of the mental and the physical. In other words, if the assimilation of the concept of truth precedes the learning of the norms pertaining to the mental and the physical, then the solution to the problem of the gap between these norms can be directly related to the conditions and differences in the use of the notion of truth.
The article considers the pros and cons of a measurement-theoretic analogy proposed by some philosophers as an illustration of semantic indeterminacy. Within this analogy, the ascription of meanings to linguistic expressions is compared with the attribution of numbers according to a certain theory of measurement. Donald Davidson used this analogy in order to extend W. V. O. Quine's thesis of the indeterminacy of translation to the interpretation of all human behavior. In other words, not only linguistic meanings but all mental states are considered indeterminate. The article explains why certain counter-arguments put forward against Davidson's use of this analogy, and against the indeterminacy thesis as a whole, fail. In particular, the instrumentalist version of the analogy is rejected, since there is no direct relation between indeterminacy and the underdetermination of theories by empirical evidence. The article concludes that semantic indeterminacy is largely grounded in the indeterminacy of rationality.
This paper is devoted to the analysis of indeterministic models of causation in the analytic philosophy of action. At the beginning of the article, I deal with the indeterministic theories most prominent in contemporary debates, authored by Robert Kane, Daniel Dennett, Alfred Mele and Laura Ekstrom. After a critical examination of these theories, and after identifying what is, in my opinion, their main difficulty, I provide an alternative account of action. The approach seeks to locate rationality inside the action itself rather than in a specific causal chain preceding it, as has been common since Donald Davidson's famous article "Actions, Reasons, and Causes". I propose to consider action as an ontological entity consisting of three parts: 1) a goal; 2) the means of achieving the goal; 3) a teleological relation "for" between 1 and 2. Thus "Jim moved his arm to take a cup of tea" has three parts: 1) taking a cup of tea; 2) moving Jim's arm; 3) the "for" relation between 1 and 2. This account of rational action, I expect, would avoid some long-standing paradoxes of the causal theory of action. Finally, I introduce the concept of a non-phenomenal will, in an attempt to explain some further properties of agency, such as its "active" character, and to close the explanatory gap between merely thinking about doing something and actually doing it. This non-phenomenal will is nevertheless not meant to be something necessarily transcendent and extra-natural; on the contrary, it could be compatible with the naturalistic point of view provided by neuroscience.
There are two problem areas associated with modern studies in the philosophy of mind focusing on the identification and convergence of human and machine intelligence. One problem is the machine simulation of meaning; the other is the machine simulation of sense. Today it is widely accepted that the conception of so-called "strong AI", in which the presence of consciousness is regarded as an indicator of an implemented program, has little chance of being proven. This line of criticism would have a greater effect if a non-classical (structural and post-structural) theory of sign, meaning, sense and language in general were also engaged. The point is that if we take into account the irreducibility of sense to configurations of meaning values, and the difficulty of formalizing sense in general, then we can show the systematic nature of the failed attempts to create computer models capable of processing information so as to simulate the functions of the human psyche, and explain along those lines the causes and mechanisms of such failures. In the present study, the analysis of the stated problems is carried out on the basis of the concepts of structural and post-structural linguistics, which have been almost entirely ignored by the philosophy of mind.