Monday, June 3, 2019
Software Architecture Design Approach
Rizwan Umaid Ali

1 Generate and Test as a Software Architecture Design Approach

1.1 About the Writer
Len Bass from the Software Engineering Institute, CMU. Published in the European Conference on Software Architecture, 2009.

1.2 Introduction
Software architecture design has become a fundamental component of the software development life cycle. As with other components of the life cycle, testing the design of the architecture is important and relates directly to the overall quality of the software application.

1.3 Problem
To make software architecture a design decision process that can test the design hypothesis, assess its quality, identify issues, and rank them by priority. The process develops test cases at each step of the design process. This results in a sequential process in which each design is developed and tested, thereby improving the overall design quality of the software system.

1.4 Design Hypothesis
Most designs are created in the context of an existing system, even when the system is created from scratch rather than modified. With this in mind, our initial hypothesis can come from the following sources:
- The system we will modify, or the new functionality we will add.
- A functionally similar system.
- A framework designed to provide services which help in the design process.
- A collection of legacy/open-source applications.

1.5 Generate Test Cases
Once we have our initial hypothesis, we have to determine how to tell whether the design satisfies the quality benchmark expected of the application. For this we establish test cases, drawing on three sources:
- Perspectives that can be used to generate test cases.
- Architecturally significant requirements.
- View-specific use cases. A number of use cases can be derived by thinking about specific architectural views.

1.6 Test Procedure
Given the test cases for the design hypothesis, the following methods can be used to test the design and detect its shortcomings:
- Analytic models using quality attributes.
- Simulations of how the design will support the test cases.
- A prototype of the initial design. This needs more effort but gives the best results.

1.7 Test Result and Next Hypothesis
The test results will either show that the design hypothesis passes all tests and fulfills the quality requirements, or reveal shortcomings. The quality attributes those shortcomings relate to should be identified first. We can then use two approaches to improve the design:
- Apply architectural patterns to the problems detected.
- Use architectural tactics that address specific quality attributes.
The updated (next) hypothesis goes through the above process recursively until a design with the required quality is achieved or the time allocated for the design process runs out, as sketched below.
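A minimal Python sketch of this generate-and-test loop. The types and function names are placeholders I introduce for illustration; the paper defines the process, not this API.

import time
from dataclasses import dataclass
from typing import Callable

@dataclass
class TestCase:
    quality_attribute: str              # e.g. "performance" or "modifiability"
    passes: Callable[[dict], bool]      # analytic model, simulation, or prototype run

def apply_tactics(design: dict, failed: list) -> dict:
    # Placeholder: a real architect applies patterns/tactics addressing
    # the failed quality attributes; here we just record the attributes.
    addressed = design.get("tactics", []) + [tc.quality_attribute for tc in failed]
    return {**design, "tactics": addressed}

def generate_and_test(design: dict, tests: list, budget_s: float = 5.0) -> dict:
    start = time.monotonic()
    while time.monotonic() - start < budget_s:
        failed = [tc for tc in tests if not tc.passes(design)]
        if not failed:
            return design               # hypothesis meets the quality benchmark
        design = apply_tactics(design, failed)  # form the next hypothesis
    return design                       # design time budget exhausted

tests = [TestCase("modifiability", lambda d: "modifiability" in d.get("tactics", []))]
print(generate_and_test({"source": "a functionally similar system"}, tests))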
1.8 Conclusion
This paper presents a software architecture design process in which we test, validate, and update our design until it reaches the quality benchmark. The architect of a software system can use this process to identify shortcomings and decide between alternative design structures.

2 SecArch: Architecture-level Evaluation and Testing for Security

2.1 About the Writer
Sarah Al-Azzani and Rami Bahsoon from the University of Birmingham. Published in the Joint Working IEEE/IFIP Conference on Software Architecture (WICSA) and European Conference on Software Architecture (ECSA), 2012.

2.2 Introduction
Software architecture models or views are evaluated to detect problems early in the software development lifecycle. We can detect critical security vulnerabilities at this stage and get a chance to improve quality at a very low cost. This paper presents a methodology for detecting security vulnerabilities caused by implied scenarios and race conditions.

2.3 Problem
Incorporating multiple views of an architecture and studying the interactions between them gives us ways to analyze security concerns in concurrent systems. This is done by comparing complete and incomplete system models using two methods: one for detecting implied scenarios using behaviour models, and one for detecting race conditions using scenario diagrams.

2.4 Scenario-based Specifications
Scenario-based specifications are based on procedural flow through components. Each scenario explains a partial view of the concurrent system. The scenario-based model has the following three properties:
- the composition of scenarios from multiple component views of the software system,
- the possible continuations between multiple scenarios, and
- the hidden implied scenarios.

2.5 Implied Scenarios
Implied scenarios are formed by dynamically combining two different scenarios and providing an architectural flow for them in a state representation. Below is an example of a behaviour model combining two different scenarios. The method uses an incremental algorithm for detecting inconsistent implied scenarios from sequence models.
Figure 1: Behaviour model example

2.6 Detecting Race Conditions
We can apply race condition scenarios to the above model and identify security vulnerabilities. There are three possible cases:
- Race condition 1: the server is disabled during authentication.
- Race condition 2: the user commits to buy an item while the server is being disabled.
- Race condition 3: the server is disabled while the user is logging off.
Below are sequence diagrams for these three race conditions; a small sketch of the idea follows them.
Figure 2: Race conditions
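A toy sketch of the underlying idea, not the paper's incremental algorithm: when two scenarios run concurrently with no ordering between them, any pair of events acting on the same component can interleave either way, so such pairs are candidate races. The scenarios and event names below are illustrative, loosely based on the shopping example above.

from itertools import product

# Each scenario is an ordered list of (component, action) events.
user_scenario  = [("Server", "authenticate"), ("Server", "buy_item")]
admin_scenario = [("Server", "disable"), ("Server", "confirm_shutdown")]

def candidate_races(scenario_a, scenario_b):
    """Pairs of events from two concurrent scenarios that touch the same
    component; with no synchronisation, either order is possible."""
    return [(ca, a, b)
            for (ca, a), (cb, b) in product(scenario_a, scenario_b)
            if ca == cb]

for component, a, b in candidate_races(user_scenario, admin_scenario):
    print(f"Race on {component}: '{a}' may interleave with '{b}'")
    # e.g. the server may be disabled during authentication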
2.7 Conclusion
This paper presented an incremental architecture evaluation method that merges behaviour models with structural analysis for improved detection of inconsistencies, examining the concept of implied scenarios and the detection of race conditions. The writers also compared the proposed method with current industry practices and tested it on industry projects, finding that it gives better results. Future work will focus on generating test cases to perform live testing on the system under test.

3 Towards a Generic Architecture for Multi-Level Modeling

3.1 About the Writer
Thomas Aschauer, Gerd Dauenhauer, and Wolfgang Pree from the University of Salzburg. Published in the European Conference on Software Architecture, 2009.

3.2 Introduction
Software architecture modeling frameworks are essential for representing architectures, their views, and the viewpoints those views are derived from. Conventional modeling approaches like UML are not expressive enough to describe both the models and the meta-models (which define the models) of an architecture.

3.3 Problem
Conventional modeling techniques use general-purpose meta-models, which are not sufficient for modern software models. Model-driven architecture has to take a more generic approach to describe multi-level architectures.

3.4 Model-driven Design and Parameter Generation
Model-driven engineering (MDE) is a method for managing the complexity of developing large, software-intensive systems. The models in MDE are the main artifacts describing a system under design. This paper aims at developing a framework for the model-driven generation of automation system configuration parameters using a testbed platform. The configuration parameters for the automation system can be generated automatically when a testbed model includes its hardware and software components.
Figure 3: Testbed configuration MDE

3.5 Presented Prototypical Implementation
The following example explains the modeling approach presented in the paper. Component is an example of a fixed meta-model element represented as code in the environment. Different types of engines can now be created either by instantiating Component directly or by cloning the initial Engine and copying it to the new engine.
In the example, the Engine has two attributes, Inertia and MaxSpeed. In the prototypical approach each element is an instance and must provide values for these attributes. Diesel and Otto represent two kinds of engines; since they are cloned from Engine, they receive copies of the attributes Inertia and MaxSpeed, as well as their values. Italics mark such copied attributes; grey text indicates that the attribute values are kept unchanged.
Figure 4: Meta-model example
In Figure 4, DType represents a family of diesel engines; D1, finally, is a concrete, physically existing member. A small sketch of this clone-based style follows.
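A minimal sketch of clone-based (prototypical) modeling, assuming the attribute names from the engine example; the class below is my illustration, not the paper's implementation. Every level is an ordinary instance, so the same mechanism describes a meta-element, a family, and a concrete member.

import copy

class Element:
    def __init__(self, name, attributes=None):
        self.name = name
        self.attributes = dict(attributes or {})

    def clone(self, name, **overrides):
        # A clone receives copies of all attributes and their values;
        # the caller may then override individual values to refine it.
        child = copy.deepcopy(self)
        child.name = name
        child.attributes.update(overrides)
        return child

engine = Element("Engine", {"Inertia": None, "MaxSpeed": None})
dtype = engine.clone("DType", Inertia=12.5)   # a family of diesel engines
d1 = dtype.clone("D1", MaxSpeed=4500)         # a concrete, existing engine

print(d1.attributes)   # {'Inertia': 12.5, 'MaxSpeed': 4500}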
3.6 Conclusion
The paper presents applications of multi-level modeling in the domain of testbed automation systems, explains why conventional modeling is insufficient for its MDE requirements, and shows how multi-level modeling can solve the representation issues. The writers present an approach to represent models in much more detail with simple notations.

4 Automated Reliability Prediction from Formal Architectural Descriptions

4.1 About the Writer
João M. Franco, Raul Barbosa, and Mário Zenha-Rela from the University of Coimbra, Portugal. Published in the Joint Working IEEE/IFIP Conference on Software Architecture (WICSA) and European Conference on Software Architecture (ECSA), 2012.

4.2 Introduction
Assessment of quality attributes (i.e., non-functional requirements such as performance, safety, or reliability) of software architectures during the design phase, so that early decisions are validated and the quality requirements are achieved.

4.3 Problem
These quality requirements are most often checked manually, which is time-consuming and error-prone due to the overwhelming complexity of designs. The paper proposes a new approach to assess the reliability of software architectures: extracting and validating a Markov model from the system specification written in an Architecture Description Language (ADL).

4.4 Reliability Prediction Process
Many different methods for reliability prediction are known, each targeting particular failure behaviours and different reliability assessment methods. The writers present the following process:
- Identify the architecture, its modules, and their interactions.
- Specify the probability of failure of each module as a percentage.
- Combine the architecture with the failure behaviour.
Below is an example of a batch-sequential style state model using the Markov model; a small numeric sketch follows it.
Figure 5: Markov model example
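A minimal numeric sketch of the idea for the batch-sequential style; the module names and failure probabilities are invented for illustration, and this is not the paper's tool. In a batch-sequential chain each module runs once in order: with probability (1 - f) control passes to the next module, otherwise the chain is absorbed in the Failure state, so reaching the end means absorption in Success.

# Illustrative per-module failure probabilities.
modules = {"Parse": 0.001, "Transform": 0.002, "Export": 0.0005}

def batch_sequential_reliability(failure_probs):
    """Probability of reaching the Success state: the product of each
    module's probability of not failing."""
    reliability = 1.0
    for f in failure_probs.values():
        reliability *= 1.0 - f
    return reliability

print(f"Predicted reliability: {batch_sequential_reliability(modules):.4%}")
# Richer styles (call-and-return, fault tolerance) yield branching
# transition matrices that are solved with linear algebra instead.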
4.5 Validation of the Process
The writers validated the process in two steps:
- validity of the reliability prediction, and
- validity across different architectural styles.
The validations were compared to previous research studies. The results were similar, showing that the mathematical models were accurate.

5 In Search of a Metric for Managing Architectural Technical Debt

5.1 About the Writer
Robert L. Nord and Ipek Ozkaya from the Software Engineering Institute, CMU. Published in the Joint Working IEEE/IFIP Conference on Software Architecture (WICSA) and European Conference on Software Architecture (ECSA), 2012.

5.2 Introduction
Technical debt is a trade-off between short-term and long-term value. Taking shortcuts to optimize the delivery of features in the short term incurs debt, analogous to financial debt, that must be paid off later to optimize long-term success. This paper demonstrates an architecture-focused and measurement-based approach to calculate technical debt by describing an application under development.

5.3 Problem
Technical debt depends strongly on how the system evolves. An organization that has to evolve its system has to make sure that future development will not increase its debt and will come at a lower cost. In this paper the writers develop a metric that assists in strategically managing technical debt.

5.4 Architecture Debt Analysis
We analyze technical debt along two paths with different priorities.
Path 1: Deliver soon. To deliver a working version of the system quickly, this path calls for making the minimum required effort at the beginning.
Path 2: Reduce rework and enable compatibility. This path requires an investment in infrastructure during the first deliveries.
The cost comparison of both paths is illustrated in the table below.
Table 1: Cost comparison
We can calculate the total cost T with a function taking the implementation cost Ci and the rework cost Cr as input:
T = F(Ci, Cr)
For simplicity we assume the function simply sums the two costs, T = Ci + Cr. We can now compare the total cost with the cumulative cost; a small sketch of this comparison follows.
Table 2: Cost comparison with cumulative cost
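A minimal sketch of the cost model under the summing assumption above; the per-release cost figures are invented for illustration and are not the paper's data.

# Each release has an implementation cost Ci and a rework cost Cr.
path1 = [(10, 0), (10, 8), (10, 20)]   # deliver soon: rework grows
path2 = [(25, 0), (10, 2), (10, 3)]    # invest upfront: rework stays low

def cumulative_costs(path):
    """Total cost per release, T = Ci + Cr, accumulated across releases."""
    totals, running = [], 0
    for ci, cr in path:
        running += ci + cr
        totals.append(running)
    return totals

print("Path 1 cumulative:", cumulative_costs(path1))  # [10, 28, 58]
print("Path 2 cumulative:", cumulative_costs(path2))  # [25, 37, 50]
# The shortcut path is cheaper at first but is overtaken once the
# accumulated rework (the debt) comes due.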
5.5 Modeling Rework
In Agile software development an important challenge is to give value to long-term goals, not only short-term ones. Taking an architectural design decision now always costs less than refactoring the design in future implementations.
An organization should take the following perspective on its technical debt:
- Focusing only on short-term goals puts the organization at technical risk once the debt can no longer be handled.
- Shortcuts can bring short-term success until the rework costs start to arrive and the cost and timeline become unmanageable.
- Architectural decisions require active follow-up and continuous cost analysis, to make sure that each design decision has the intended impact on future development costs.

5.6 Conclusion
From this research we conclude that future development of a well-designed application has a lower cost and is less uncertain. Technical debt is therefore lower if the architecture is well defined and fulfills the quality attribute requirements.

6 Research Topic: Testing Software Architectural Changes and Adapting Best Practices to Achieve the Highest Quality in a Quantifiable Manner

6.1 Introduction
We have looked into testing methodologies, the design process, and possible technical debt in software architecture. We now look at how technical debt is affected when, due to future requirements, the architecture has to be changed.

6.2 Proposed Research Problem
We will first estimate the technical debt of an existing software architecture and software system. We then use design changes and code changes to estimate technical debt and quality attributes. The prediction is made based on comparisons with similar change bursts that occurred in the architecture, using the views of the software architecture. This is relevant in Agile development.

6.3 Types of Changes
We can classify each type of architectural change by analyzing its overall impact on the architecture and the possibilities of technical debt arising from it. We also assign a propagation value to each type of change so that its estimated impact can be quantified (see the sketch after this list).
- Small architectural change in one or a few views: low technical debt increase (0.10).
- Addition of new architecture, i.e., architecture added for new functionality: medium technical debt increase (0.30).
- Small changes in several views: high technical debt increase (0.60).
- Massive architectural change in several views: high technical debt increase (0.80).
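A minimal sketch of how these propagation values could be applied; the accumulation rule (a simple weighted sum over the changes in a release) is my assumption for illustration, not part of the proposal above.

# Propagation value per change type, as proposed above.
PROPAGATION = {
    "small_change_few_views": 0.10,
    "new_architecture_added": 0.30,
    "small_changes_several_views": 0.60,
    "massive_change_several_views": 0.80,
}

def estimated_debt_increase(changes):
    """Accumulate the propagation value of each architectural change."""
    return sum(PROPAGATION[change] for change in changes)

release = ["small_change_few_views", "new_architecture_added"]
print(f"Estimated technical debt increase: {estimated_debt_increase(release):.2f}")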
6.4 Proposed Solution
After analyzing the research papers and the book Software Architecture in Practice, I can give the following points on how the technical debt of a new architecture can be managed:
- Compare the updated architecture with the original and see how the updates have increased the technical debt.
- Apply the same test cases that were used on the initial software architecture.
- See how the quality attributes have increased or decreased after the update.

6.5 Reduction of Technical Debt
To reduce technical debt after architectural changes, the following strategies can be adopted.
6.5.1 Refactoring
- Apply architectural patterns to improve several quality attributes.
- Use architectural tactics that address specific quality attributes.
6.5.2 Retaining Existing Architecture Models
- Continue with the patterns of the existing architecture.
- Search for modifiability tactics already used and stick to those tactics.

7 References
[1] Len Bass. Generate and Test as a Software Architecture Design Approach. In WICSA/ECSA 2009, pages 309-312.
[2] Sarah Al-Azzani and Rami Bahsoon. SecArch: Architecture-level Evaluation and Testing for Security. In 2012 Joint Working IEEE/IFIP Conference on Software Architecture (WICSA) and European Conference on Software Architecture (ECSA), pages 51-60, Aug. 2012.
[3] Thomas Aschauer, Gerd Dauenhauer, and Wolfgang Pree. Towards a Generic Architecture for Multi-Level Modeling. In European Conference on Software Architecture 2009, pages 121-130.
[4] J. Franco, R. Barbosa, and M. Zenha-Rela. Automated Reliability Prediction from Formal Architectural Descriptions. In 2012 Joint Working IEEE/IFIP Conference on Software Architecture (WICSA) and European Conference on Software Architecture (ECSA), pages 302-309, Aug. 2012.
[5] R. Nord, I. Ozkaya, P. Kruchten, and M. Gonzalez-Rojas. In Search of a Metric for Managing Architectural Technical Debt. In 2012 Joint Working IEEE/IFIP Conference on Software Architecture and 6th European Conference on Software Architecture, pages 91-100, 2012.