Advancing Semiconductor Innovation: A Conversation with Aparna Mohan
Aparna Mohan is an accomplished Design Verification Engineer based in Austin, Texas. With a strong educational foundation, including a Master of Science in Electrical and Computer Engineering from North Carolina State University and a Bachelor of Technology in Applied Electronics and Instrumentation from the University of Kerala, India (where she was recognized as a Third Rank Holder), Aparna combines academic excellence with extensive industry experience. Her professional journey spans from satellite technology at the Indian Space Research Organization to her current expertise in semiconductor design verification, where she has contributed to 14 successful ASIC product tape-outs.
Q 1: What drew you to design verification engineering?
A: I got interested in design verification because I love solving complex problems and ensuring quality in technology products. I have always found it fascinating to catch potential issues before they become real problems in actual applications. For me, verification is an opportunity to apply analytical thinking while working with fascinating technologies in the semiconductor arena. It is also a great feeling to help create reliable, high-performance chips used in millions of devices globally.
Q 2: With your experience in both formal verification and UVM methodologies, how do you decide which is more suitable for a project?
A: My outlook is to use the right tool for the right job. For corner cases and properties of interest, formal verification can give an absolute guarantee, particularly on control paths. In contrast, simulation-based verification with UVM generates constrained-random test cases to check functionality. As a general approach, I reach for formal verification for critical state machines, for proving security features, and anywhere an exhaustive proof is a must, and I use simulation for end-to-end functional verification. A hybrid approach is often the strongest strategy: formal techniques to prove the essential properties that establish design correctness, and simulation to validate overall functionality and assess integration.
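The contrast can be illustrated with a toy Python sketch (illustrative only: the state machine, its input, and the checked property are all invented for this example, and a real flow would use a formal tool and a UVM testbench). Exhaustive exploration proves the property over every reachable state; constrained-random simulation samples the same property over whatever states random stimulus happens to visit.

```python
import random

# Toy model of a three-state handshake FSM: IDLE -> REQ -> ACK -> IDLE.
IDLE, REQ, ACK = 0, 1, 2

def step(state, req_in):
    """Next-state function of the toy design under test."""
    if state == IDLE:
        return REQ if req_in else IDLE
    if state == REQ:
        return ACK
    return IDLE  # ACK always returns to IDLE

def formal_check():
    """Formal-style exhaustive exploration: visit every reachable state
    under every input, then confirm the property holds on all of them."""
    visited, frontier = set(), [IDLE]
    while frontier:
        s = frontier.pop()
        if s in visited:
            continue
        visited.add(s)
        for req_in in (0, 1):
            frontier.append(step(s, req_in))
    # Property: only the three legal states are ever reachable.
    return visited <= {IDLE, REQ, ACK}

def random_sim(cycles=1000, seed=0):
    """Constrained-random 'simulation': drive random inputs and check
    the same property, but only on the states actually visited."""
    rng = random.Random(seed)
    state, ok = IDLE, True
    for _ in range(cycles):
        state = step(state, rng.randint(0, 1))
        ok = ok and state in {IDLE, REQ, ACK}
    return ok
```

The formal check gives a proof for this tiny state space; the random run only gives confidence proportional to the states it happened to reach, which is exactly the trade-off described above.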
Q 3: How do you approach the verification of increasingly complex SoC architectures with multiple interacting IP blocks?
A: Verifying complex SoC architectures requires a hierarchical approach combined with thoughtful integration testing. I usually start by ensuring that individual IP blocks are thoroughly verified in their own testbenches, covering the full range of operational scenarios in which each IP must function. For system-level verification, I stress the interfaces between IP blocks and their interactions, since this is where the most challenging bugs usually come up.
What I have found is that verification strategies must focus especially on cross-module interactions. That means constructing many scenarios in which multiple IP modules operate concurrently in various configurations. I also include system-level assertions that monitor protocol compliance and data integrity across interfaces.
Another important technique has been system-level coverage-driven verification using specially crafted coverage metrics for integration points; this ensures we exercise the most important interactions without getting hopelessly lost in the state space of the entire system. I also align with system architects to understand the expected traffic patterns and use cases, so I can focus verification on the most critical paths through the system.
Experience has also guided me to develop abstract models that capture the essential behaviors of highly complex interactions. These models keep simulation complexity manageable, so that even a short simulation run can exercise more than one scenario.
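A minimal sketch of the integration-point coverage idea (the IP names, bins, and random traffic below are hypothetical, standing in for what a SystemVerilog cross covergroup would track at an interconnect):

```python
import itertools
import random

# Hypothetical coverage model: track which combinations of
# (initiator IP, target IP, burst length) have been exercised.
INITIATORS = ["cpu", "dma", "gpu"]
TARGETS = ["ddr", "sram", "reg_bank"]
BURSTS = [1, 4, 8]

class CrossCoverage:
    def __init__(self):
        # Every legal (initiator, target, burst) bin is a coverage goal.
        self.goal = set(itertools.product(INITIATORS, TARGETS, BURSTS))
        self.hit = set()

    def sample(self, initiator, target, burst):
        """Record one observed transaction at the integration point."""
        self.hit.add((initiator, target, burst))

    def percent(self):
        return 100.0 * len(self.hit & self.goal) / len(self.goal)

    def holes(self):
        """Uncovered bins -- where to direct the next tests."""
        return sorted(self.goal - self.hit)

cov = CrossCoverage()
rng = random.Random(42)
for _ in range(50):  # random traffic from a toy testbench
    cov.sample(rng.choice(INITIATORS), rng.choice(TARGETS), rng.choice(BURSTS))
print(f"cross coverage: {cov.percent():.1f}%, holes left: {len(cov.holes())}")
```

The `holes()` report is the point of the technique: it turns "have we tested most interactions?" into a concrete to-do list, without enumerating the full system state space.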
Q 4: How has your experience at the Indian Space Research Organization influenced your approach to the ASIC verification process?
A: My experience at ISRO instilled in me a rigorous approach to quality and reliability that has been invaluable in ASIC verification. Working on space technologies, where failure is not an option, taught me the importance of thorough testing and meticulous attention to detail. The experience of developing systems for the Indian Mars mission gave me perspective on how electronic components must function reliably in extreme conditions. This background has shaped my verification philosophy: I approach each chip as if it were going into a mission-critical application. Additionally, my experience with both RTL design and verification at ISRO gave me insights into designers’ perspectives, which helps me collaborate more effectively with design teams today.
Q 5: How do you keep up with fast-changing verification methodologies and tools?
A: It is a continuous learning process; one has to keep pace with the verification community and industry dynamics. I attend industry conferences and participate in technical forums on advances in verification. I value a network of contacts from whom I can learn and through whom I can adapt to industry trends, and I read technical publications to keep abreast of developments in the field. I also take advanced training as new tools and methodologies are introduced. Presenting at conferences obliges me to formalize my knowledge and to learn from other researchers. Above all, I believe in learning by doing: when new methodologies show potential, I implement them myself, sometimes in small test projects, to explore their features in practice and learn from failures.
Q 6: What are your preferred tools and techniques for debugging complex verification issues?
A: Effective debugging requires the right mix of powerful tools and a systematic methodology. Among the many tools available, the most notable for me are Verdi for waveform analysis, which helps localize bugs, and JasperGold or Formality for formal verification. For a very complex problem, a divide-and-conquer approach often proves useful: take a failing test and progressively simplify it until it still fails in a much smaller environment. A systematic debug procedure also includes well-placed assertions, visualization of data flow, and, where a failure involves specific data values, formal properties targeted at proving behavior for exactly those values. A verification engineer also benefits from keeping a debugging log for each issue, recording hypotheses, findings, and the conditions under which the error was seen. This is a good safeguard against circular investigation when moving between different issues.
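The "simplify a failing test" idea can be sketched in a few lines of Python (a hypothetical stand-in: the bug trigger and the transaction names are invented, and in practice the check would re-run the actual simulation):

```python
def still_fails(stimulus):
    """Stand-in for re-running the test. Here the assumed 'bug' fires
    whenever a 'write' is immediately followed by a 'flush'."""
    return any(a == "write" and b == "flush"
               for a, b in zip(stimulus, stimulus[1:]))

def minimize(stimulus):
    """Greedy one-at-a-time reduction: drop each transaction and keep
    the smaller sequence whenever the failure still reproduces."""
    i = 0
    while i < len(stimulus):
        candidate = stimulus[:i] + stimulus[i + 1:]
        if still_fails(candidate):
            stimulus = candidate      # the dropped item wasn't needed
        else:
            i += 1                    # keep it; it is part of the trigger
    return stimulus

trace = ["read", "write", "read", "write", "flush", "read"]
print(minimize(trace))  # a two-transaction reproducer: ['write', 'flush']
```

Shrinking a thousand-transaction failure to a handful of transactions is usually the single biggest accelerator in root-causing it, whatever tool ultimately displays the waveform.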
Q 7: How do you approach mentoring and leading your verification team?
A: My outlook on leadership rests on empowering people while making good use of technical expertise. When I teach verification techniques, I emphasize the thought process behind them, giving engineers the chance to understand why certain approaches work better than others for different verification challenges. I try to set technical benchmarks and create an engaging environment for my fellow engineers rather than manage them. While setting clear targets and schedules, I give each member space to choose the most suitable way to meet those targets. I am also a proponent of knowledge sharing within the team: I hold regular informal sessions and encourage team members to present, so that boundary issues which would otherwise stay hidden get aired sooner rather than later. And since everybody makes mistakes, I offer constructive suggestions aimed at real growth in practice rather than mere reactions.
Q 8: What guidance would you offer to someone beginning a career in semiconductor verification?
A: Build a strong foundation in verification methodologies and digital design principles, and always understand what you are verifying and why. Be curious and determined to dig deep into issues – the most interesting bugs rarely present themselves easily. Seek mentorship to guide your growth, but trust your own observations and insights. When the technology is complex, break it down into manageable parts. Be flexible about methodologies, since they change, but know that the rigorous principles of verification remain the same. Finally, communicate clearly about your work; a verification engineer stands between design intent and realization, so communication is as important as the technical aspect.
Q 9: How do you see design verification evolving over the next few years?
A: I think we are at an inflection point in verification technology. The advent of machine learning and generative AI is changing how we approach coverage generation and test optimization, and I believe we will see broader automation of test generation and bug detection, with AI assistants suggesting verification strategies based on design patterns. As system complexity increases and a larger portion of functionality moves into firmware, hardware-software co-verification will become even more important. Formal verification will continue to grow in importance, and AI tools can help generate testbenches, assertions, and binds. Virtual prototyping and emulation will also evolve to handle increasingly complex systems. Throughout this transformation, the role of the verification engineer will shift from writing testbenches to architecting comprehensive verification strategies implemented through these advanced technologies.
Q 10: What do you consider the most outstanding achievement of your career, and what is your vision for your long-term professional journey?
A: I am particularly proud of building a highly reusable digital and mixed-signal verification framework that was adopted across multiple product lines and dramatically increased verification efficiency and quality. This required understanding both the digital and analog domains and developing models that accurately reflected subtle mixed-signal behavior while keeping simulation performance high. Looking forward, I envision leading verification strategy at an architectural level, where my influence reaches the earliest stages of how verification is integrated into the product life cycle. I am particularly interested in developing hardware security verification methodologies, including formal techniques, as systems become increasingly connected and security-critical. I also hope to contribute to the wider verification community through publications and talks, sharing knowledge that advances the field collectively.
About Aparna Mohan
Aparna Mohan is a Design Verification Engineer with 11+ years of expertise in pre-silicon verification and methodology implementation. With contributions to 14 taped-out ASIC products, Aparna specializes in functional verification methodologies (UVM, SystemVerilog), SVA, and formal verification techniques. Her educational background includes a Master’s degree from North Carolina State University and a Bachelor’s degree from the University of Kerala, where she was recognized as a Third Rank Holder. Before entering the semiconductor industry, Aparna worked at the Indian Space Research Organization, contributing to satellite technology and the Indian Mars mission. She has published research papers in international conferences and regularly presents her innovative verification methodologies at industry events.