by: Ray Anderson
Knowledge Architecture: A Fundamental Approach
KBE is the automation of tasks based on known facts (intellectual property) and on the relationships among those facts and processes. This automation delivers predictable, repeatable results, thus increasing quality and accelerating workflow.
The primary source of all data for Knowledge Engineering is the Subject Matter Expert (SME). These experts must be immersed in a culture of peer-to-peer discussion, and they must have the maturity and presence of mind to argue their points of experience logically and constructively. No single point of reference is sufficient to make an SME opinion worthy of entry into a Knowledge Database. These data points must be thoroughly critiqued and corroborated not only by experience, but by recorded evidence of their validity.
The data points primarily contributed by SMEs are Methods, Heuristics, and Proofs. Briefly, a Method is a procedure or sequence of events that is deemed the most efficient way of performing a given task. When these methods are found to split into derivatives, each branch must be completely and accurately defined as what is commonly known as an option. At the top of each derivative branch there should be a set of contributing attributes capable of delineating the decision criteria for those branches. If branches are found to merge near or at the following output, an opportunity for standardization and workflow compression has emerged.
Heuristics are what are commonly called rules of thumb. Most cultures embrace heuristics as a way to expedite decision making and thus reduce resource requirements. However, unconditional heuristics would have left us with stone wheels rather than pneumatic tires. Heuristic attitudes toward problem solving are the enemy of innovation and competition, which are what we as corporations are known to thrive on. A competitive Knowledge Based Engineering (KBE) system must force SMEs to compete with one another, and even with corporate competitors, to cause a differentiation of KBE thought and culture. This differentiation is what will create growth and opportunity for both the individual and the corporation.
Proofs are data points that are based on target values and on physical laws and relationships that do not change beyond accepted ranges for any reason. Examples of proofs are the acceleration of gravity, the relationship between mass and gravity that yields weight (W = mg), or the targeted Six Sigma range for a given process result. As you might imagine, Proofs are the most tightly controlled and non-negotiable data points in a Knowledge Management System (KMS).
Within a KMS, all of the data points must be organized via attributes, so that an SQL (Structured Query Language) search can be performed on multiple attributes. The need for multiple attributes is evidenced by attempts to find data points that are not completely described. SQL allows the user to find the proverbial “needle in a haystack” by searching for attributes such as “Steel”, “1.25 inch”, “Thread”, and “Sharp”. The resultant feedback could still yield a blade that belongs to a small knife, except that the “Thread” attribute has effectively refined the search. I used this example to demonstrate that most SQL searches are only effective if the person who defined the attributes was thinking one step beyond the obvious description of a needle. If the shank of the blade had been threaded for attachment to its handle, further refinement of the needle's attributes might be required. Such refinements can be captured in real time by the SQL itself, in learning systems. By not accepting more than one result, a Learning System (LS) could record which of the results you chose and prompt for the criterion by which you chose it, such as “0.8 mm Diameter”. After the SQL is amended, the LS would retrace the query and show you the “Needle” as the only possible result. Subsequent queries can now present you with all possible attribute options in a “Wizard” format.
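To make that concrete, here is a minimal sketch of such a multi-attribute query, written in Python against SQLite. The table, columns, and rows are my own illustration of the needle-and-blade example, not an actual KMS schema.

import sqlite3

# Build a toy knowledge base in memory. The schema and data are
# illustrative assumptions, not a prescribed KMS layout.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE data_points (name TEXT, material TEXT, length_in REAL, "
    "features TEXT, tip TEXT, diameter_mm REAL)"
)
conn.executemany(
    "INSERT INTO data_points VALUES (?, ?, ?, ?, ?, ?)",
    [
        ("needle",      "Steel", 1.25, "Thread", "Sharp", 0.8),
        ("knife blade", "Steel", 1.25, "Thread", "Sharp", 12.0),  # threaded shank
    ],
)

# Each attribute narrows the candidate set; the final criterion
# ("0.8 mm diameter") is the refinement a Learning System would record
# after the user rejects the knife blade.
rows = conn.execute(
    "SELECT name FROM data_points "
    "WHERE material = ? AND length_in = ? AND features = ? AND tip = ? "
    "AND diameter_mm = ?",
    ("Steel", 1.25, "Thread", "Sharp", 0.8),
).fetchall()
print(rows)  # [('needle',)]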
The attributes common to a KMS that supports KBE are Manufacturing Rules and Ranges, Cost Abatement Criteria, Six Sigma Measurements on Parts and Processes, Part and Assembly Functionality, Package Containment and Space Claim Requirements, Attachment Methods, and Aesthetic Theme. Each of these attribute tables must be defined on every data point, so that the SME is reminded to think about the relationships between them. When relationships are found, links can be built from one data point to another, defining a neural-network-like construct that will result in a more robust understanding of the task or decision at hand.
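A minimal sketch of what such a linked data point might look like in code, assuming nothing more than a name, a set of attribute tables, and a list of links; the class and field names are mine, purely for illustration:

from dataclasses import dataclass, field

@dataclass
class DataPoint:
    """One KMS data point with its attribute tables and its links."""
    name: str
    attributes: dict                      # e.g. {"Attachment Methods": "threaded"}
    links: list = field(default_factory=list)

    def link(self, other: "DataPoint") -> None:
        # A discovered relationship becomes a two-way edge in the network.
        self.links.append(other)
        other.links.append(self)

thread = DataPoint("M6 thread", {"Attachment Methods": "threaded",
                                 "Manufacturing Rules and Ranges": "tap depth 2xD"})
boss = DataPoint("boss feature", {"Package Containment and Space Claim Requirements":
                                  "10 mm clearance"})
thread.link(boss)   # the SME records that the boss exists to carry the thread
print([dp.name for dp in thread.links])  # ['boss feature']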
As you can see, expertise in Database Management and Data Point Entry is critical to the success of Knowledge Management. To meet this need, Process Owners (POs) must be designated and empowered to interact with the Subject Matter Experts in order to objectively measure the effects of Design and Manufacturing Process changes. Process improvements cannot be accurately or safely measured without a full understanding of the expected impacts. Every opportunity for improvement must be assessed, tested, and measured individually, to avoid noise in the measurements.
As each process change is shown to generate an improvement, requisite changes will be found and will need to be weighed against the expected benefits of the process change as a whole. This stage of the analysis will hone the skills of both the SMEs and the POs, and will require retraining of personnel and revision of documentation. Once the process has been improved and the expected benefits are realized, it is time to study automation.
In this stage, opportunities for automation will often emerge on their own, but they must still be carefully considered. Impact assessments will reveal the amount of time saved by automation if, and only if, a complete prototype of the code can be generated. This code does not have to be completely stable, but it must follow the process in order to validate its efficiency. Often, in the automation stage, a programmer will discover further opportunities for improvement. These opportunities must not be integrated without the SMEs and POs studying the process as a whole to verify that critical tasks are not being compromised.
In this stage, the SMEs take on another task. It now becomes their responsibility to populate the database with data points and heuristics on a daily basis. It is also the point at which supporting logic and methods are entered separately and linked to their related DB content. At this time, more opportunities for automation and DB architecture improvements will emerge. These are also considered process changes, and must be reviewed as such via an Impact Assessment.
This stage of Knowledge Engineering is the point at which the Computer Aided Design (CAD), Computer Aided Machining (CAM), and Product Data Management (PDM) systems should become connected to the vast vault of the Knowledge Database (KDB). This is where the validity of the data and its format of Attributes, Queries, and Outputs culminates in real, measurable value. Applications, interfaces (GUIs), and reports become the focus at this stage.
Examples are the linkage of the CAD system(s) to the KDB in order to run process compatibility checks, populate parametric fields, generate relationships between parameters, provide search-engine capabilities with a DB connection to the results, and generate reports.
CAM linkage to the KDB provides machinability checks, tooling compatibility checks, G-code subroutine retrieval and reuse, Six Sigma feature checks, and report generation.
PDM linkage to the KDB provides an understanding of what components may already exist to satisfy the design requirements, what processes are related to each component, what costs are associated with it, and what plant capacities are available.
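As a concrete illustration of that last linkage, here is a sketch of the kind of reuse query a PDM-connected KDB could answer. The schema, part numbers, and figures are assumptions made up for the example:

import sqlite3

kdb = sqlite3.connect(":memory:")
kdb.executescript("""
    CREATE TABLE components (part_no TEXT, material TEXT, mass_kg REAL);
    CREATE TABLE processes  (part_no TEXT, process TEXT, unit_cost REAL,
                             plant_capacity INTEGER);
    INSERT INTO components VALUES ('BRKT-100', 'Steel', 0.4);
    INSERT INTO processes  VALUES ('BRKT-100', 'stamping', 1.15, 50000);
""")

# "Does a component already exist that satisfies the design requirements,
# and what process, cost, and plant capacity come with it?"
rows = kdb.execute(
    "SELECT c.part_no, p.process, p.unit_cost, p.plant_capacity "
    "FROM components c JOIN processes p ON p.part_no = c.part_no "
    "WHERE c.material = ? AND c.mass_kg <= ?",
    ("Steel", 0.5),
).fetchall()
print(rows)  # [('BRKT-100', 'stamping', 1.15, 50000)]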
This is the stage at which a complete statement of work (SOW) is prepared for the DB change requirements and the Automation Code. In both cases, the required changes and code generation have already been tried and proven as prototypes against Beta KDB, CAD, CAM, and PDM systems. The need for Beta environments has been recognized as a necessity for many years. No software development should ever occur on production environments or data, due to the risks associated with experimentation, such as security, in-work deliverables, stability, and the lurking unknown.
The development of Automation Code must be closely monitored by a Project Manager. In most cases, the Process Owner should serve as the Project Manager, since the success or failure of the project directly affects their workflow. Important components of the SOW are source code ownership, documentation, roles and responsibilities, and accurate records of milestone performance. The SOW should also state that each time a new change in capability within the DB or GUI is required, it must be weighed against a pre-negotiated change tolerance and cost model prior to implementation.
When the value of these automations and DB changes is measured, they must be considered as components of a larger plan. Their value assessment may show that, in and of themselves, they carry little value, or even a deficit, yet they may be prerequisites for a long-term automation or DB capability that far outweighs the short-term deficit. This requires the foresight and budgetary tolerance of what I call the “Cost Of Glory” (COG). This is usually the missing COG in the financial machine, which almost always demands an immediate Return On Investment (ROI).
The fallacy of short-term strategies can best be compared to raising children. Knowledge Management is an attempt to feed enough relevant data into a “Brain” of child processes to fuel “Thought”, in the form of SQL, linkages, and automations, which then “Execute” and achieve results that, when “Measured”, prove to be both valid and efficient. The difference is that while we understand child-rearing to be an 18-25 year process, Knowledge Management and Engineering is often understood to be a 2-5 year process, and its success is measured while it is still in its infancy, or even in vitro if resources have been restricted or diminished. Decisions based on this early measure will usually result in a Board of Directors decision to eliminate the waste in order to maximize profits in the short term.
There must be a unanimous decision by the Board of Directors to commit to a long-term strategy of growing a fully capable KDB system, along with stability of direction for the term of that commitment, in order to succeed. The Board must ultimately be the doting grandparent of this child for it to succeed in its enormous endeavor of becoming smarter and more successful than all of its competitors. To hold that status, a continual process of improvement and reinvention must be pursued as new methods emerge from a technological evolution that is continually accelerating.
Machine Assisted Intelligence
In summary, I leave you with a thought-provoking theory. The name I have given to the concept of a fully developed KMS/KBE/PDM system is Machine Assisted Intelligence (MAI), and it includes a component that I have not even discussed in this article. That component is Derivative Decision Management (DDM), and it is based on a neural-network thought process. MAI is completely devoid of emotion, but inclusive of social and economic impact assessment, as well as all of the data points considered by the KDB in the generation of components and assemblies.
First, do no harm
Have you ever wondered if your doctor remembers that part of their oath? Well, if you haven't, you have chosen the proper physician for your ailment. What I hope to do for you is either properly diagnose your ailments or reexamine your perception of ailments as they relate to Catia. The purpose of these articles is not to criticize hardware, software, or even the management of these tools. I will focus my efforts on the accurate diagnosis of your aches and pains. Then we will focus our efforts on a better understanding of the use of the enormous black bag of Catia.
It is my understanding that there have been many questions regarding the proper use of Catia Version 4's Advanced Surfacing tools. These questions range from a basic understanding of the terminology and tools to the ethereal practices of the styling gurus. In this first article, I will explain the terminology involved in the production of lofted surfaces. These terms carry over from the almighty SURF2 into the next-generation functionality of the SKIN function and the V5 Generative Surface Design tools. Speaking of V5 Generative Surfacing: it is the most awesome display of mathematical genius that I have ever seen in CAD. This is not a paid advertisement, so I will move on. We doctors don't know the meaning of the term pro bono.
Imagine yourself on the beach, with a cool salty breeze wafting lightly across your face. Your tan is deepening, and the gulls are drifting in search of the next clueless crab. The smell of coconut oil permeates your every thought, as you wake to the gentle voice that says “You've Got Mail!” You rise from your chaise and meander to the table shaded by a huge umbrella to see what wonderful surprise awaits you aboard your palmtop. Your shades spring to life with the radio-transmitted image from the palm, and you see that your exquisite design has undergone the scrutiny of those tie-wearing, hypertensive, clueless beings called BOSS. Remembering that they were so kind as to allow you to take the day off, provided you stay in touch, you smile and say “No problem” with such enthusiasm as to wake your significant other.
You are now racing to the office in eager anticipation of the changes issued by the Design Review Team, and your machine has booted and fired up Catia in the time since you left your car with the underpaid valet. As you settle into your overstuffed ergonaturelle design station, the wall springs to life with your favorite, hand-picked design review team. Each of the members addresses you with the utmost professionalism and respect due a genius such as yourself, and enthusiastically praises the brilliance of your design. With apology and fervor, the team quietly asks for a favor. The customer has asked that the XRV4 be given a different flavor and style by adding a new surface suggested by their marketing department. The customer has lofted some wireframe elements that suggest a post-depression-era revision of your otherwise perfect design. With a sad heart, you agree that the customer is always right. Right?
When I snap my fingers, you will WAKE UP!!!!!
While you were sleeping, we received some new loft curves for the XRV4. These new curves need to be used to generate a cover for the product. Your problem is that you don't know how you got that first surface to work, but it looked OK, so the design was submitted. I am going to define a process for you to drive this new surface, and explain the nomenclature in terms that we can all understand.
First of all, check to see whether the new sections are all planar. If they are not, make it so. Second, take the final section at each end and duplicate it beyond the part edge. This will help you design a slab that terminates outside of the finished region of the part. By doing so, you eliminate some of the adverse effects of net termination.
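If your sections arrive as sampled points rather than native Catia geometry, the planarity check can even be scripted outside of Catia. Here is a minimal sketch, assuming each section is an array of XYZ points: fit the best plane and test the worst out-of-plane deviation against a tolerance.

import numpy as np

def is_planar(points: np.ndarray, tol: float = 1e-3) -> bool:
    """True if all points lie within tol of their best-fit plane."""
    centered = points - points.mean(axis=0)
    # The right singular vector with the smallest singular value is the
    # normal of the least-squares plane through the points.
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]
    return bool(np.abs(centered @ normal).max() <= tol)

section = np.array([[0.0, 0.0, 0.0],
                    [10.0, 0.0, 0.0],
                    [10.0, 5.0, 0.0],
                    [0.0, 5.0, 0.0005]])   # one point barely out of plane
print(is_planar(section))  # True: the deviation is well inside tolerance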
Now, let's take a look at some terminology.
Given a set of planar section cuts through the product, you may add any intermediate sections that you require to further define it. Be careful not to over-constrain the product, so that the surface will have room to flow from section to section. These sections are called GENERATING CURVES.
Once the G Curves have been established as continuous in curvature and tangency, your next task is to define the SPINE. The SPINE is a curve that flows perpendicular to each G Curve plane. For this reason, you will have to create a plane through each G Curve and a limit point on the first plane at either end of the product. Once these have been established, use CURVE2+SPINE to select the point, and then each plane in sequence. A rule to understand about the Spine is that any definite section (G Curve) that you must obtain on the drawing has to be built in a plane perpendicular to the spine, and used to generate the surface. For this reason, reverse logic applies: the section G Curve should be established before the spine is calculated, not after.
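That perpendicularity rule is easy to verify numerically. Here is a small sketch, assuming you have sampled the plane normals and the spine tangents at each crossing; the tolerance is my own choice for illustration.

import numpy as np

def spine_normal_to_planes(tangents, normals, tol_deg: float = 0.5) -> bool:
    """True if the spine tangent at each plane crossing is parallel
    to that G Curve plane's normal, within tol_deg degrees."""
    for t, n in zip(tangents, normals):
        t = t / np.linalg.norm(t)
        n = n / np.linalg.norm(n)
        angle = np.degrees(np.arccos(np.clip(abs(t @ n), 0.0, 1.0)))
        if angle > tol_deg:
            return False
    return True

# Two section planes 15 degrees apart, and spine tangents that match them.
normals  = [np.array([1.0, 0.0, 0.0]), np.array([0.9659, 0.2588, 0.0])]
tangents = [np.array([1.0, 0.0, 0.0]), np.array([0.9659, 0.2588, 0.0])]
print(spine_normal_to_planes(tangents, normals))  # True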
The next stage of surface development is the realization that the edges of the surface are free to wander unless otherwise specified. If you wish to specify the edges, make sure that the curve you specify touches the end point of each G Curve. This, in effect, seals the edge of the surface coincident with the ends of the G Curves. The LIMIT curves must contact ALL of your G Curves and be continuous in tangency in order to compute.
Finally, there may be instances where you are required to control the shape of the surface spans between the G Curves. Some of you may see this stage as a potential Pandora's box, but it is not. Keeping in mind that the spans between the G Curves are also free to wander, Catia uses the simplest solution to cross each span. If you wish to take control of this solution and force an additional contour, keep the following in mind: MID-CURVES must touch every G Curve of the surface and be continuous in tangency. Sound familiar? One set of rules covers both the LIMIT and MID curves, as the sketch below illustrates. There is another way to control the span, called an Area Law. I will leave Area Laws open for a later article, if you wish.
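Here is a quick sketch of that shared rule, assuming the candidate LIMIT or MID curve and the G Curve end points are available as sampled coordinates; the tolerance is illustrative.

import numpy as np

def touches_all(curve_pts: np.ndarray, g_curve_ends, tol: float = 1e-3) -> bool:
    """True if the sampled curve passes within tol of every G Curve end point."""
    for end in g_curve_ends:
        if np.linalg.norm(curve_pts - end, axis=1).min() > tol:
            return False   # the curve misses a G Curve; the surface won't compute
    return True

# A candidate LIMIT (or MID) curve, sampled, and the end points of three G Curves.
limit = np.array([[0.0, 0.0, 0.0], [50.0, 2.0, 0.0], [100.0, 0.0, 0.0]])
ends = [np.array([0.0, 0.0, 0.0]),
        np.array([50.0, 2.0, 0.0]),
        np.array([100.0, 0.0, 0.0])]
print(touches_all(limit, ends))  # True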
This has been a test of the surface definition system. SURF2+CURVE+CRV-CRV will never seem the same, once you understand these fundamentals. If I have been unclear on any of these explanations, I can be reached at mailto:firstname.lastname@example.org or call me at (316)218-2148. Explanations may cost you a game of BZFlag or PS3 MotorStorm, though.
See you at the Beach!