DSpace Community: International Journal ITA
http://hdl.handle.net/10525/5
Information Theories and Applications
Virtual Instruments – Functional Model, Organization and Programming Architecture
http://hdl.handle.net/10525/979
Title: Virtual Instruments – Functional Model, Organization and Programming Architecture

Authors: Georgiev, G.S.; Georgiev, G.T.; Stefanova, S.

Abstract: This paper presents a functional model, an organization, a programming architecture and an implementation of Virtual Instruments as an essential part of educational laboratory tools. The Virtual Instruments are designed in an event-driven programming environment and are capable of performing instrumental functions at the local or remote level. The possibility of realizing real-time operations, from the point of view of signal information, is discussed.

Description: ∗ Thematic Harmonisation in Electrical and Information EngineeRing in Europe, Project Nr. 10063-CP-1-2000-1-PT-ERASMUS-ETNE.

http://hdl.handle.net/10525/978
Title: Web-based Simultaneous Equation Solver

Authors: Iliev, Anton; Kyurkchiev, Nikolay; Todorov, Todor

Abstract: In this paper we present the methods, the theoretical basis of the algorithms, and the computer tools that we have used to construct our Web-based equation solver.

Description: * This work has been supported by NIMP, University of Plovdiv under contract No MU-1.

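The abstract leaves the solver's internals unspecified. As a hedged sketch, one classical method such a simultaneous-equation solver might implement is Gaussian elimination with partial pivoting; the function name and example system below are illustrative, not taken from the paper:

```python
def solve_linear_system(a, b):
    """Solve A x = b for a square matrix A (list of rows) and vector b."""
    n = len(a)
    # Work on an augmented copy so the caller's data is untouched.
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        # Partial pivoting: bring the largest remaining entry to the diagonal.
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(col + 1, n):
            factor = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= factor * m[col][c]
    # Back substitution.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(m[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (m[r][n] - s) / m[r][r]
    return x

# Example: x + y = 3, 2x - y = 0  ->  x = 1, y = 2
print(solve_linear_system([[1, 1], [2, -1]], [3, 0]))
```

Partial pivoting keeps the elimination numerically stable, which matters for a solver accepting arbitrary user input over the Web.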
http://hdl.handle.net/10525/977
Title: One Approach for the Optimization of Estimates Calculating Algorithms

Authors: Dokukin, Alexander

Abstract: In this article a new approach to the optimization of estimates calculating algorithms is suggested. It can be used for finding the correct algorithm of minimal complexity within the algebraic approach to pattern recognition.

http://hdl.handle.net/10525/976
Title: Representing Autoepistemic Logic in Modal Logic

Authors: Brown, Frank

Abstract: The nonmonotonic logic called Autoepistemic Logic is shown to be representable in a monotonic Modal Quantificational Logic whose modal laws are stronger than S5. Specifically, it is proven that a set of sentences of First Order Logic is a fixed-point of the fixed-point equation of Autoepistemic Logic with an initial set of axioms if and only if the meaning, or rather disquotation, of that set of sentences is logically equivalent to a particular modal functor of the meaning of that initial set of sentences. This result is important because the modal representation allows the use of powerful automatic deduction systems for Modal Logic and because, unlike the original Autoepistemic Logic, it is easily generalized to the case where quantified variables may be shared across the scope of modal expressions, thus allowing the derivation of quantified consequences. Furthermore, this generalization properly treats such quantifiers since both the Barcan formula and its converse hold.

http://hdl.handle.net/10525/975
Title: On the Relationship between Quantified Reflective Logic and Quantified Default Logic

Authors: Brown, Frank

Abstract: Reflective Logic and Default Logic are both generalized so as to allow universally quantified variables to cross modal scopes, whereby the Barcan formula and its converse hold. This is done by representing both the fixed-point equation for Reflective Logic and the fixed-point equation for Default Logic as necessary equivalences in the Modal Quantificational Logic Z, and then inserting universal quantifiers before the defaults. The two resulting systems, called Quantified Reflective Logic and Quantified Default Logic, are then compared by deriving metatheorems of Z that express their relationships. The main result is to show that every solution to the equivalence for Quantified Default Logic is a strongly grounded solution to the equivalence for Quantified Reflective Logic. It is further shown that Quantified Reflective Logic and Quantified Default Logic have exactly the same solutions when no default has an entailment condition.

http://hdl.handle.net/10525/974
Title: Representing Default Logic in Modal Logic

Authors: Brown, Frank

Abstract: The nonmonotonic logic called Default Logic is shown to be representable in a monotonic Modal Quantificational Logic whose modal laws are stronger than S5. Specifically, it is proven that a set of sentences of First Order Logic is a fixed-point of the fixed-point equation of Default Logic with an initial set of axioms and defaults if and only if the meaning, or rather disquotation, of that set of sentences is logically equivalent to a particular modal functor of the meanings of that initial set of sentences and of the sentences in those defaults. This result is important because the modal representation allows the use of powerful automatic deduction systems for Modal Logic and because, unlike the original Default Logic, it is easily generalized to the case where quantified variables may be shared across the scope of the components of the defaults, thus allowing such defaults to produce quantified consequences. Furthermore, this generalization properly treats such quantifiers since both the Barcan Formula and its converse hold.

http://hdl.handle.net/10525/973
Title: Representing Reflective Logic in Modal Logic

Authors: Brown, Frank

Abstract: The nonmonotonic logic called Reflective Logic is shown to be representable in a monotonic Modal Quantificational Logic whose modal laws are stronger than S5. Specifically, it is proven that a set of sentences of First Order Logic is a fixed-point of the fixed-point equation of Reflective Logic with an initial set of axioms and defaults if and only if the meaning of that set of sentences is logically equivalent to a particular modal functor of the meanings of that initial set of sentences and of the sentences in those defaults. This result is important because the modal representation allows the use of powerful automatic deduction systems for Modal Logic and because, unlike the original Reflective Logic, it is easily generalized to the case where quantified variables may be shared across the scope of the components of the defaults, thus allowing such defaults to produce quantified consequences. Furthermore, this generalization properly treats such quantifiers since all the laws of First Order Logic hold and since both the Barcan Formula and its converse hold.

http://hdl.handle.net/10525/972
Title: Automatic Translation of MSC Diagrams into Petri Nets

Authors: Kryvyy, Sergiy; Matvyeyeva, Lyudmila; Lopatina, Mariya

Abstract: Development engineers work in languages intended for software or hardware systems design, while test engineers use languages effective for verification, analysis of system properties, and testing. Automatic interfaces between languages of these two kinds are necessary in order to avoid ambiguous understanding of system model specifications and inconsistencies in the initial requirements for systems development. An algorithm for the automatic translation of MSC (Message Sequence Chart) diagrams compliant with the MSC'2000 standard into Petri Nets is suggested in this paper. Each input MSC diagram is translated into a Petri Net (PN), and the obtained PNs are sequentially composed in order to synthesize the whole system into one final combined PN. The principle of this composition is defined through the basic element of the MSC language: conditions. During translation, a reference table is developed to maintain consistent coordination between the input system descriptions in the MSC language and in the PN format. This table is necessary to present the results of analysis and verification on the PN in the MSC diagram format familiar to the development engineer. The proof of the algorithm's correctness is based on the process algebra ACP. The most significant feature of the given algorithm is its way of handling conditions. The direction for future work is the development of an integral, partially or completely automated technological process that will allow designing a system and testing and verifying its various properties in one framework.

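As a heavily simplified, hypothetical sketch of the translation idea (the paper's MSC'2000 condition handling, PN composition, and reference table are not reproduced), a single MSC message can be mapped to a send transition and a receive transition joined by an "in transit" place; all names here are invented for illustration:

```python
def message_to_pn(msg, sender, receiver):
    """Map one MSC message to a tiny Petri Net fragment.
    Transitions are (input places, output places) pairs."""
    places = [f"{sender}_ready", f"{msg}_in_transit", f"{receiver}_done"]
    transitions = {
        # Sending consumes the sender's ready token, puts the message in transit.
        f"send_{msg}": ([f"{sender}_ready"], [f"{msg}_in_transit"]),
        # Receiving consumes the in-transit token, marks the receiver done.
        f"recv_{msg}": ([f"{msg}_in_transit"], [f"{receiver}_done"]),
    }
    return places, transitions

places, transitions = message_to_pn("req", "Client", "Server")
print(places)
print(transitions)
```

A full translator would then compose such fragments over shared condition places, which is the part of the algorithm the abstract emphasizes.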
http://hdl.handle.net/10525/971
Title: A Gradient-Type Optimization Technique for the Optimal Control for Schrodinger Equations

Authors: Farag, M.

Abstract: In this paper, we consider the optimal control of a Schrodinger equation. Based on the formulation for the variation of the cost functional, a gradient-type optimization technique utilizing the finite difference method is developed to solve the constrained optimization problem. Finally, a numerical example is given, and the results show that the method of solution is robust.

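The paper's cost functional and Schrodinger-equation constraint are not reproduced here; the following minimal sketch only illustrates the generic pattern the abstract names, a gradient-type descent whose gradient is approximated by finite differences (toy objective and invented names):

```python
def fd_gradient(f, x, h=1e-6):
    """Approximate the gradient of f at x by central differences."""
    g = []
    for i in range(len(x)):
        xp, xm = x[:], x[:]
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

def gradient_descent(f, x0, step=0.1, iters=200):
    """Plain fixed-step descent driven by finite-difference gradients."""
    x = x0[:]
    for _ in range(iters):
        g = fd_gradient(f, x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Toy cost functional with its minimum at (1, -2).
cost = lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2
print(gradient_descent(cost, [0.0, 0.0]))
```

In the PDE setting, each evaluation of the cost functional would require solving the discretized Schrodinger equation, which is why a cheap gradient approximation is attractive.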
http://hdl.handle.net/10525/970
Title: On Statistical Hypothesis Testing via Simulation Method

Authors: Dimitrov, B.; Green, D.; Rykov, V.; Stanchev, Peter

Abstract: A procedure for calculating the critical level and power of a likelihood ratio test, based on a Monte-Carlo simulation method, is proposed. General principles of building software for its realization are given. Some examples of its application are shown.

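A minimal sketch of the simulation idea, under the assumption of a simple mean-shift test; the function names and the sample-mean statistic are illustrative, not the authors' software. The critical value is an empirical upper quantile of the statistic simulated under the null, and power is the rejection rate under the alternative:

```python
import random

def simulated_critical_value(statistic, simulate_h0, alpha=0.05, n_sim=2000, seed=1):
    """Estimate the critical value as the empirical upper alpha-quantile
    of the test statistic under the null hypothesis."""
    rng = random.Random(seed)
    stats = sorted(statistic(simulate_h0(rng)) for _ in range(n_sim))
    return stats[int((1 - alpha) * n_sim)]

def estimate_power(statistic, simulate_h1, critical, n_sim=2000, seed=2):
    """Estimate power as the rejection rate under the alternative."""
    rng = random.Random(seed)
    rejections = sum(statistic(simulate_h1(rng)) > critical for _ in range(n_sim))
    return rejections / n_sim

# Toy example: H0: mean = 0 versus H1: mean = 1, statistic |sample mean|.
stat = lambda xs: abs(sum(xs) / len(xs))
h0 = lambda rng: [rng.gauss(0, 1) for _ in range(30)]
h1 = lambda rng: [rng.gauss(1, 1) for _ in range(30)]
crit = simulated_critical_value(stat, h0)
power = estimate_power(stat, h1, crit)
print(crit, power)
```

The same two routines apply unchanged to a likelihood ratio statistic; only `stat` and the simulators change.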
http://hdl.handle.net/10525/969
Title: Frontal Solutions: an Information Technology Transfer to Abstract Mathematics

Authors: Jotsov, Vladimir

Abstract: The paper introduces a method for discovering dependencies during human-machine interaction. It is based on an analysis of numerical data sets in knowledge-poor environments. The procedures involved are independent and interact on a competitive principle. The research focuses on seven of them. The application is in Number Theory.

http://hdl.handle.net/10525/968
Title: The Subclassing Anomaly in Compiler Evolution

Authors: Radenski, Atanas

Abstract: Subclassing in collections of related classes may require the re-implementation of otherwise valid classes just because they utilize outdated parent classes, a phenomenon referred to as the subclassing anomaly. The subclassing anomaly is a serious problem since it can void the benefits of code reuse altogether. This paper offers an analysis of the subclassing anomaly in an evolving object-oriented compiler. The paper also outlines a solution for the subclassing anomaly that is based on an alternative code reuse mechanism, named class overriding.

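The following hypothetical fragment (class names invented, not from the paper's compiler) illustrates the anomaly in miniature: a subclass written against an old parent becomes invalid, and must be rewritten, when the framework evolves the parent's contract:

```python
class NodeV1:
    """Original framework class: code generation takes no arguments."""
    def emit(self):
        return "nop"

class MyLoopNode(NodeV1):
    """User subclass, perfectly valid against NodeV1."""
    def emit(self):
        return "loop:" + super().emit()

class NodeV2:
    """Evolved framework class: emit() now requires a context argument.
    MyLoopNode cannot simply be re-parented onto NodeV2, because its
    emit() signature no longer matches the new contract; the otherwise
    sound subclass must be rewritten. That forced rewrite is the
    subclassing anomaly."""
    def emit(self, context):
        return f"nop@{context}"

print(MyLoopNode().emit())  # still works against the outdated parent
```

Class overriding, as the abstract describes it, aims to let evolved parents replace outdated ones without invalidating such subclasses.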
http://hdl.handle.net/10525/967
Title: Admissible Substitutions in Sequent Calculi

Authors: Lyaletski, Alexander

Abstract: For first-order classical logic, a new notion of admissible substitution is defined. This notion allows optimizing the procedure of applying quantifier rules when logical inference search is made in sequent calculi. Our objective is to show that a computer-oriented sequent technique can be created that does not require preliminary skolemization of the initial formulas and that compares efficiently with methods that exploit skolemization. Some results on its soundness and completeness are given.

http://hdl.handle.net/10525/966
Title: Systems Analysis: the Structure-and-Purpose Approach Based on Logic-linguistic Formalisation

Authors: Lukiyanova, Lyudmila

Abstract: Systems analysis (SA) is widely used in solving complex and vague problems. The initial stages of SA are the analysis of problems and purposes, which yields problems and purposes of smaller complexity and vagueness that are combined into hierarchical structures of problems (SP) and purposes (PS). Managers have to be sure that the PS, and the purpose-realizing system (PRS) that can achieve the PS purposes, are adequate to the problem to be solved. However, SP/PS are usually not substantiated well enough, because their development is based on collective expertise in which the logic of natural language and expert estimation methods are used. That is why the scientific foundations of SA cannot be considered completely formed. The structure-and-purpose approach to SA, based on logic-and-linguistic simulation of problem/purpose analysis, is a step towards formalizing the initial stages of SA to improve the adequacy of their results, and also towards increasing the quality of SA as a whole. Managers of industrial organizing systems using the approach eliminate logical errors in SP/PS at the early stages of planning, and so will be able to find better solutions to complex and vague problems.

http://hdl.handle.net/10525/965
Title: The Hough Transform and Uncertainty

Authors: Donchenko, Volodymyr

Abstract: The paper deals with generalisations of the Hough Transform that make it a means for analysing uncertainty. Some results related to the Hough Transform for Euclidean spaces are presented. These use the powerful means of the Generalised Inverse to describe the Transform itself as well as its Accumulator Function.

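For orientation, here is a minimal sketch of the classical line-detecting Hough Transform that the paper generalises; the generalisations themselves and the Generalised-Inverse machinery are not shown. Each edge point votes, in an accumulator, for every line rho = x*cos(theta) + y*sin(theta) passing through it:

```python
import math

def hough_lines(points, n_theta=180, rho_step=1.0):
    """Accumulate votes over (rho index, theta index) cells."""
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            key = (round(rho / rho_step), t)
            acc[key] = acc.get(key, 0) + 1
    return acc

# Collinear points on the vertical line x = 5 (theta = 0, rho = 5).
pts = [(5, y) for y in range(10)]
acc = hough_lines(pts)
# All ten points vote for the cell at rho index 5, theta index 0.
print(acc[(5, 0)])
```

The Accumulator Function mentioned in the abstract is exactly the vote count `acc` viewed as a function of the parameter cell.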
http://hdl.handle.net/10525/964
Title: Defining Interestingness for Association Rules

Authors: Brijs, Tom; Vanhoof, Koen; Wets, Geert

Abstract: Interestingness in association rules has been a major topic of research in the past decade. The reason is that the strength of association rules, i.e. their ability to discover ALL patterns given some thresholds on support and confidence, is also their weakness. Indeed, a typical association rules analysis on real data often results in hundreds or thousands of patterns, creating a data mining problem of the second order. In other words, it is not straightforward to determine which of those rules are interesting to the end-user. This paper provides an overview of some existing measures of interestingness and comments on their properties. In general, interestingness measures can be divided into objective and subjective measures. Objective measures tend to express interestingness by means of statistical or mathematical criteria, whereas subjective measures of interestingness aim at capturing more practical criteria that should be taken into account, such as the unexpectedness or actionability of rules. This paper focuses only on objective measures of interestingness.
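As a hedged illustration of objective measures (the toy baskets and function names are invented, not taken from the paper), three standard measures for a rule A -> B are support, confidence, and lift:

```python
def measures(transactions, antecedent, consequent):
    """Compute support, confidence, and lift of antecedent -> consequent
    over a list of transactions (each transaction is a set of items)."""
    n = len(transactions)
    both = sum(antecedent <= t and consequent <= t for t in transactions)
    ante = sum(antecedent <= t for t in transactions)
    cons = sum(consequent <= t for t in transactions)
    support = both / n                                   # P(A and B)
    confidence = both / ante if ante else 0.0            # P(B | A)
    lift = confidence / (cons / n) if cons else 0.0      # P(B | A) / P(B)
    return support, confidence, lift

baskets = [
    {"bread", "butter"}, {"bread", "butter", "milk"},
    {"bread"}, {"milk"}, {"bread", "milk"},
]
print(measures(baskets, {"bread"}, {"butter"}))
```

A lift above 1 indicates that the antecedent raises the probability of the consequent, one simple objective criterion for ranking the thousands of rules the abstract mentions.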