Education, Engineering, Entrepreneurial Mindset Learning, leadership, STEM

Citadel Engineering: Focused on Students, Not Rankings

So proud of The Citadel Engineering faculty and staff for the national accolades recognizing their student-focused education efforts. As a team, we are focused on delivering the best engineering education experience and value for our cadets and students. The rankings may follow, but they aren't our priority or focus.

🏆 Proud that U.S. News has recognized us as a top-25 undergraduate engineering program (non-doctoral) in the nation for the 13th straight year!

🏆 Proud that we are tied as the 4th-highest-ranked public non-doctoral undergraduate engineering program in the nation. (We are state-funded, not federally funded or private.)

🏆 Proud to be the #1 engineering undergrad program (non-doctoral) out of the senior military colleges. 

We are humbled because all this is made possible only by the support we receive from the State of South Carolina, our industry partners, and our alumni. Congrats again to our faculty, staff, and leadership team!

#GoDogs!

© 2023 Andrew B. Williams

About the Author: Andrew B. Williams is Dean of Engineering and Louis S. LeTellier Chair for The Citadel School of Engineering. He was recently named one of Business Insider's Cloudverse 100 and humbly holds the designation of AWS Education Champion. He sits on the AWS Machine Learning Advisory Board and is a certified AWS Cloud Practitioner. He is proud to have recently received a Generative AI for Large Language Models certification from DeepLearning.AI and AWS. Andrew has also held positions at Spelman College, University of Kansas, University of Iowa, Marquette University, Apple, GE, and Allied Signal Aerospace Company. He is author of the book, Out of the Box: Building Robots, Transforming Lives.

Standard
Ai, Artificial Intelligence, Design Thinking, Education, Engineering, Entrepreneurial Mindset Learning, Innovation, Technology

Looking “Under the Hood” of Generative AI

Is using AI like driving a car? You don't have to know how to design a car in order to drive it around town. Back in the day, we had a whole class called "Driver's Ed," taken during junior high school. In that class, we learned the rules of the road and got to sit behind the wheel for the first time and drive. My Dad, on the other hand, not only knew how to drive a car, but he would buy old cars (because we couldn't afford a new one), take out the old engine, rebuild it, and put it back in. Using AI or machine learning algorithms can be similar to driving a car. Diving deeper into them is more like repairing or designing a car. You don't have to know how to repair or design a car just to drive it.

Driving "Artificial Intelligence" Programs, or "Driving the Car"

Learning and using artificial intelligence is like driving a car. Nowadays, you don't need to know how to create the algorithms yourself; you can just "drive" them. Like driving a car, you still need to know the rules of the road, how to evaluate where you are and where you are going, and how to stay safe. But you don't need to know all the intricacies of how an internal combustion engine or an electric motor works just to drive it. Back when I first started studying AI during my master's in 1991, we had to either write our own AI code or find open-source software that we could add to. Today, with platforms like AWS SageMaker, the algorithms are already coded. You can access and use them if you know which ones to use and how to string them together sequentially in Python code. You just need to read up or take a class on how to use them. Thankfully, for educators at Community Colleges, HBCUs, Minority-Serving Institutions, and some PUIs, AWS has set up an AI Educator Enablement program. I'm happy that five of our faculty have begun taking the AI Bootcamps offered in conjunction with The Coding School. This semester, I'm teaching a class modeled after AWS Machine Learning University's Machine Learning Through Application course.
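To make the "driving the car" idea concrete, here is a minimal sketch in Python of stringing prebuilt algorithms together without writing any of the underlying math yourself. It uses the open-source scikit-learn library and a toy dataset purely for illustration; it is not taken from SageMaker or the AWS course.

    # "Driving the car": chaining prebuilt ML algorithms in Python.
    # The dataset and model choices here are illustrative only.
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.linear_model import LogisticRegression

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # String two prebuilt algorithms together sequentially: scale, then classify.
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(X_train, y_train)
    print("test accuracy:", model.score(X_test, y_test))

Notice that nothing here required implementing gradient descent or a loss function by hand; the "engine" is already built, and the work is in choosing and sequencing the parts.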

Designing “Artificial Intelligence” Programs, or “Designing the Car”

Learning artificial intelligence can also be like learning to design a car. Many of us are now familiar with ChatGPT, a conversational generative pre-trained transformer that can generate new text based on our typed-in prompts. Recently, I took the DeepLearning.AI and Amazon Web Services (AWS) Coursera course on Generative AI with Large Language Models and earned the certificate of successful completion at the end. This course was more like getting "under the hood" of the car and seeing what makes it work. I thoroughly enjoyed the course. Here are a few things I liked.

Understanding the Generative AI Product Lifecycle

The GenAI course described the entire GenAI lifecycle, including defining the scope, selecting the LLM to use, adapting and aligning the model, and integrating the application. The first part of this lifecycle is understanding the use cases in higher education. We have to start with where it makes sense to use an LLM in an engineering course, for example. I'm excited that tomorrow we have our "Safely Exploring Generative AI for Faculty and Student Learning" design thinking session, supported by the Kern Family Foundation and our new virtual Center for Artificial Intelligence, Algorithmic Integrity, and Autonomy Innovation (AI3). We have faculty from all five of our Schools (Engineering, Business, Humanities, Math and Science, and Education) coming, representing about fifteen departments across campus. It's imperative that all of our faculty begin thinking about the impact of GenAI on education generally and, for engineering, how it will change the way we prepare our engineers.

Pre-Training a Large Language Model

In the early days of AI, intelligent agents, or AI-enabled computer programs, were designed to "reason" symbolically using logic and inference engines. Today, the "reasoning" and "learning" in AI are done using statistical methods. In the course, LLMs are described as statistical calculators. LLMs take in large amounts of unstructured data, for example from the internet, pass the data through a data quality filter, and create the LLM by running the pre-training algorithm on GPUs, updating the LLM weights.
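As a toy illustration of the "statistical calculator" idea, the snippet below estimates next-word probabilities from simple bigram counts. This is only an analogy I am adding here; real LLMs learn billions of weights with transformer networks rather than counting word pairs.

    # Toy "statistical calculator": next-word probabilities from bigram counts.
    from collections import Counter, defaultdict

    corpus = "the cat sat on the mat the cat ate the food".split()

    bigrams = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        bigrams[prev][nxt] += 1

    def next_word_probs(word):
        counts = bigrams[word]
        total = sum(counts.values())
        return {w: c / total for w, c in counts.items()}

    print(next_word_probs("the"))  # {'cat': 0.5, 'mat': 0.25, 'food': 0.25}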

How Pre-Training Works

LLMs are essentially trained to guess the next word in a text using a Transformer architecture, specified in the paper "Attention Is All You Need." A transformer consists of an encoder and a decoder. Depending on the purpose of the task, you can have encoder-only models, encoder-decoder models, or decoder-only models; a short code sketch follows the list below.

  • Autoencoder models, or encoder-only models, take the input words, or tokens, and learn to guess masked tokens using a bi-directional context. They are good at tasks such as sentiment analysis, recognizing named entities, and classifying words. Example models: BERT and RoBERTa.
  • Autoregressive models are decoder-only models that attempt to predict the next token in a text using a one-directional context. These are good for generating text and similar tasks; this is the type of model GPT is. Example models: GPT and BLOOM.
  • Sequence-to-sequence models mask, or hide, random spans of input tokens in the encoder, and the decoder tries to reconstruct the span, or sequence of tokens, autoregressively. These are good for summarizing text, question answering, and translating text. Example models include T5 and BART.
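If you want to experiment with all three families yourself, here is a minimal sketch using the open-source Hugging Face transformers library. The specific model checkpoints are common public examples I picked for illustration, not ones prescribed by the course, and the code assumes transformers and a backend such as PyTorch are installed.

    # One example from each transformer family, via Hugging Face pipelines.
    from transformers import pipeline

    # Encoder-only (autoencoding, e.g., BERT): fill in a masked token.
    fill = pipeline("fill-mask", model="bert-base-uncased")
    print(fill("Engineering education should be [MASK].")[0]["token_str"])

    # Decoder-only (autoregressive, e.g., GPT): generate the next tokens.
    gen = pipeline("text-generation", model="gpt2")
    print(gen("Large language models are", max_new_tokens=20)[0]["generated_text"])

    # Encoder-decoder (sequence-to-sequence, e.g., T5): summarize text.
    summ = pipeline("summarization", model="t5-small")
    text = ("Transformers use attention to model long-range relationships in text. "
            "They can be trained as encoders, decoders, or both.")
    print(summ(text, max_length=20, min_length=5)[0]["summary_text"])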

Tasks LLMs Can Do Well

Existing LLMs can do many tasks relatively well. These tasks include:

  • Essay writing
  • Language translation
  • Document summarization
  • Information retrieval
  • Action calls to external applications

Prompt Engineering Won't Always Improve an LLM's Results

Those who are "driving the car" of LLMs know that they can specify what result they want using a prompt. You can also configure the LLM for the amount of randomness or the length of its response by modifying inference parameters, including top k, top p, temperature, and max tokens. Writing a more complete prompt, informed by a basic knowledge of how the LLM works, can also improve the results. This is called in-context learning and can involve giving examples of the prompt and the desired results. Giving no extra examples is called zero-shot inference, and giving one is called one-shot inference. Again, these things are covered in the DeepLearning.AI and AWS course, but I thought I'd mention them. When we start diving into some of the theory, we are getting "under the hood" rather than just "driving the car."
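As a rough sketch of what those inference parameters look like in code, here is a one-shot prompt with sampling settings, again using the Hugging Face transformers library as a stand-in. The parameter values are arbitrary choices for illustration, not recommendations from the course.

    # One-shot prompt plus sampling parameters (temperature, top k, top p, max tokens).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    # One-shot inference: one worked example in the prompt, then the new input.
    prompt = (
        "Classify the sentiment.\n"
        "Review: The course was excellent. Sentiment: positive\n"
        "Review: The lab equipment kept failing. Sentiment:"
    )

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        do_sample=True,        # enable sampling so temperature/top-k/top-p matter
        temperature=0.7,       # lower = less random
        top_k=50,              # sample only from the 50 most likely tokens
        top_p=0.9,             # ...and within 90% cumulative probability mass
        max_new_tokens=5,      # cap the length of the response
        pad_token_id=tokenizer.eos_token_id,
    )
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))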

The Computational Costs for LLMs

Another aspect of the course I liked is that it gave a straightforward explanation of how much computing cost is involved. Those familiar with machine learning and cloud computing know that NVIDIA GPUs are the hardware engines that do most of the compute processing required to train LLMs. The course helps us realize that ML algorithms in general, and LLMs specifically, require lots of computational processing power. A business or a higher-ed institution conducting research will have to factor in these costs.
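To get a feel for why those costs add up, here is a back-of-the-envelope estimate. The bytes-per-parameter figures are rough rules of thumb I am using for illustration, not exact numbers from the course or any vendor.

    # Rough GPU-memory estimate for serving vs. training an LLM (rule-of-thumb numbers).
    params = 7e9                   # a hypothetical 7-billion-parameter model

    bytes_inference = params * 2   # ~2 bytes/param loaded in 16-bit precision
    bytes_training = params * 20   # ~20 bytes/param once gradients and Adam
                                   # optimizer states are added during training

    print(f"inference: ~{bytes_inference / 1e9:.0f} GB of GPU memory")
    print(f"training:  ~{bytes_training / 1e9:.0f} GB of GPU memory")

Under these rough assumptions, a single 80 GB GPU could serve such a model but could not fully fine-tune it alone, which is why training clusters, and their bills, grow so quickly.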

Techniques for Fine-Tuning the LLM

The course covers the methods used to fine-tune an LLM so that it performs better at specific types of tasks. Although most casual LLM users are only familiar with the GPT models, there are many others that can be used. This blog post is getting long, so I'll end here.

Be Happy to “Drive AI” but Be Willing to Dive Deeper

To use AI, machine learning, or generative AI models like LLMs, you don't need to know everything under the hood. But learning how these models work is helpful. Many people complain that GPTs aren't good at math; if you understand the architecture, you can see that they aren't built for that. But they can be tied in with other applications that can do those things. I am hoping that, as engineering educators, we can bring more understanding of AI, ML, and GenAI to the general public while also training others to design and build the next generation of AI algorithms.

Picture: Participants in a recent "Safely Exploring Generative AI for Faculty and Student Learning – Using Design Thinking and Entrepreneurial Mindset" session, sponsored by The Kern Family Foundation.

© 2023 Andrew B. Williams

About the Author: Andrew B. Williams is Dean of Engineering and Louis S. LeTellier Chair for The Citadel School of Engineering. He was recently named one of Business Insider's Cloudverse 100 and humbly holds the designation of AWS Education Champion. He sits on the AWS Machine Learning Advisory Board and is a certified AWS Cloud Practitioner. He is proud to have recently received a Generative AI for Large Language Models certification from DeepLearning.AI and AWS. Andrew has also held positions at Spelman College, University of Kansas, University of Iowa, Marquette University, Apple, GE, and Allied Signal Aerospace Company. He is author of the book, Out of the Box: Building Robots, Transforming Lives.

Standard
Education, Engineering, Entrepreneurial Mindset Learning, Innovation, leadership, STEM, Technology

Capturing Purpose and Passion in a Mission Statement

One of our faculty courageously said something in our meeting today to the effect of, "I hate to say it, but why do mission statements sound so generic and lack passion and love? As a parent, I've seen a lot of these college mission statements, and I'm not sure I've seen many that connect with my child or with me as a parent." Today, we hit that head-on with the help of Dr. Sonia Alvarez-Robinson. We were honored to have such a seasoned strategy consultant work with our School of Engineering to begin our new strategic planning process. If we do nothing else but refine and capture what was shared today in our breakout session, we will have succeeded.

I'm not going to give a rundown of the steps we took today but rather talk about why we took them and the energy that we felt. As people shared their personal "whys" for teaching students, for example, someone spoke literally of having love for our students and a love of the discipline they are teaching. We also heard comments that empathized with how our students engage with and experience their engineering curriculum at our institution. To that point, another faculty member added that we are starting a new class for first-year engineering students that combines all of our disciplines so that students can be exposed to each one before making a long-term commitment to a major.

We also shared how important it is to give them the fundamentals, but also to show them how to work with people in other disciplines on big problems that must be solved at a larger, and sometimes global, scale. Some of our Executive Advisory Board members, who are themselves alumni and executives in large engineering firms, stated how important interdisciplinary collaboration is in the real world. Interdisciplinary collaboration is one of our four strategic priorities, along with innovation throughout the curriculum, infrastructure for growth, and inclusion and outreach.

At the end of the session, we gave everyone the chance to share one word about how they felt about our session developed by Dr. Alvarez-Robinson to co-create our vision and mission. Words such as encouraged, informative, insightful, tiring, helpful, collaborative, productive, interesting, contemplative, and inspired were spoken. My word? Energized.

Picture: Many of our School of Engineering faculty and staff with Dr. Sonia Alvarez-Robinson (in blue) at our initial 5-year strategic planning session. (Thank you, Michael Kelsh, for taking the picture for us.)

© 2023 Andrew B. Williams

About the Author: Andrew B. Williams is Dean of Engineering and Louis S. LeTellier Chair for The Citadel School of Engineering. He was recently named one of Business Insider's Cloudverse 100 and humbly holds the designation of AWS Education Champion. He sits on the AWS Machine Learning Advisory Board and is a certified AWS Cloud Practitioner. He is proud to have recently received a Generative AI for Large Language Models certification from DeepLearning.AI and AWS. Andrew has also held positions at Spelman College, University of Kansas, University of Iowa, Marquette University, Apple, GE, and Allied Signal Aerospace Company. He is author of the book, Out of the Box: Building Robots, Transforming Lives.

Standard
Ai, Artificial Intelligence, Cloud Computing, Computer Science, Design Thinking, Education, Engineering, Entrepreneurial Mindset Learning, Entrepreneurship, Innovation, Robotics, STEM, Technology

Safely Exploring Generative AI for Faculty and Student Learning

How is generative AI going to impact your career? How is it going to impact engineering educators and students' learning experiences? We are planning a faculty design thinking session to explore how Generative AI (GenAI) can be used to help students learn. Why? Everyone is curious about it, from students to CEOs, and we are all trying to figure out how it connects to what and how we teach, and how we can create value for our students by using GenAI. We are trying to identify the opportunity it presents for engineering educators and how we can best scale it for impact. Sound familiar?

What do we mean by "safely"? We recognize that most faculty may not have a background in AI and rely on what they hear from others. Many teachers are worried about how to keep students from using it to cheat. There are many other fears that faculty and students have about AI, including the fear of being left behind the technology curve and losing relevance in their current or future careers. Many students nearing graduation worry that they are not prepared for an AI-enhanced workforce. Faculty who have not been in the engineering industry for a while, or possibly ever, may be unaware of how GenAI is impacting the workforce or the military. Also, by "safely" we mean that, as educators, we must limit exposure to toxic GenAI output, respect intellectual property, and discern accurate and honest content. Faculty and students must learn how to use GenAI responsibly.

We are going to get all of these issues out on the table in a comfortable and open intellectual space. We are going to use design thinking to empathize with faculty and students. We are going to clearly define the needs, pains, and potential gains that faculty and students have related to AI in general and to GenAI specifically. We'll take time to brainstorm potential solutions or "products." We will build some teaching prototypes and get feedback on our ideas. Trust me, it will be a "safe" space to explore the technology's impacts and learn how we can tackle the challenges together.

We are delighted to have Christina Hnova, from the University of Maryland Academy for Innovation and Entrepreneurship, facilitate our session. I met Christina through my past instructor at the Stanford d.school Teaching and Learning Studio, Dr. Leticia Brito Cavagnaro. Through the Amazon Web Services (AWS) Machine Learning University and AI Educator Enablement Program, we have begun preparing many of our faculty in the School of Engineering to teach and integrate AI. But we are also aiming to help faculty in other schools, including the Humanities, Business, Education, and Science, explore GenAI with curiosity, connections, and value creation for an interdisciplinary AI student learning experience. Come join us!

Picture: A curious learner with Kathleen, one of my Humanoid Engineering and Intelligent Robotics (HEIR) Lab undergraduate research students, during an AI-enabled humanoid robotics outreach event when I was at Marquette University.

Acknowledgments: Thanks to KEEN and the Kern Family Foundation for their support!

© 2023 Andrew B. Williams

About the Author: Andrew B. Williams is Dean of Engineering and Louis S. LeTellier Chair for The Citadel School of Engineering. He was recently named one of Business Insider's Cloudverse 100 and humbly holds the designation of AWS Education Champion. He sits on the AWS Machine Learning Advisory Board and is a certified AWS Cloud Practitioner. He is proud to have recently received a Generative AI for Large Language Models certification from DeepLearning.AI and AWS. Andrew has also held positions at Spelman College, University of Kansas, University of Iowa, Marquette University, Apple, GE, and Allied Signal Aerospace Company. He is author of the book, Out of the Box: Building Robots, Transforming Lives.

Standard
Ai, Artificial Intelligence, Cloud Computing, Computer Science, Design Thinking, diversity, Education, Engineering, Entrepreneurial Mindset Learning, Innovation, STEM, Technology

Using Entrepreneurial Mindset and Making to Spark More Accessible AI

What do you think this is a picture of? How would you imagine it connects with a student learning artificial intelligence? More on that later, but let me share a little about the process of why and how I arrived at this little contraption to teach some AI and machine learning concepts.

KEEN MakerSpark: A Framework for Developing Entrepreneurial Mindset Activities

This week I participated in the KEEN MakerSpark workshop. We looked at how to use "making" and the three C's of an entrepreneurial mindset (curiosity, connections, and creating value) to improve our engineering curriculum. Since I teach AI and machine learning, I was curious how I could use "making" to visually and tactilely demonstrate how machine learning works to a college student or even a child. In this context, "making" refers to physically making something with your hands. In the context of the 3 C's, the making is driven by a student's curiosity, their need to make connections from disparate information, and prototyping a concept or an idea to create value.

Deconstruct/Reconstruct Troublesome Knowledge

In teaching, we often want students to learn a new concept. But there is more to teaching a concept than giving a student a definition, equation, or example. We need to deconstruct what the student already knows and how they arrive at that concept. Troublesome knowledge consists of those engineering or computing concepts that our students seem to struggle with the most. Working backwards from this troublesome knowledge, we then design a learning activity with objectives that have observable outcomes and ways to measure learning.

I identified what makes some introductory AI knowledge "troublesome." Students may not know that machines can "learn." Students may not understand the different ways machines learn. Yes, at this point I could put up some complex math equations that explain machine learning, but what would that mean to a middle school student trying to learn the basics in a visual and tactile manner?

Defining Success and Struggling in Learning

Working backwards, we can define what concept we want students to learn and the things we can observe that show whether or not they have mastered it. These learning objectives should state clearly what we want the student to learn, by when (e.g., the end of class), and the observable way of telling whether they learned it. A concept in AI that I want students to learn is unsupervised versus supervised machine learning classification. Students are struggling if they can't visually identify which is which.
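For readers who prefer code to craft supplies, here is a minimal sketch of the concept itself using scikit-learn and a made-up two-dimensional dataset (my own illustration, not part of the MakerSpark activity): supervised classification learns from labeled examples, while unsupervised classification groups unlabeled points on its own.

    # Supervised vs. unsupervised classification on the same toy 2-D points.
    from sklearn.datasets import make_blobs
    from sklearn.linear_model import LogisticRegression
    from sklearn.cluster import KMeans

    X, y = make_blobs(n_samples=60, centers=2, random_state=0)  # two "colors" of balls

    # Supervised: the labels y (the colors) are given during training.
    clf = LogisticRegression().fit(X, y)
    print("supervised predictions:", clf.predict(X[:5]))

    # Unsupervised: no labels are given; the algorithm finds the groups itself.
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    print("unsupervised cluster assignments:", km.labels_[:5])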

Modeling the Knowledge Using Analogies, Sketches, Data Physicalization, or Stories

To ask myself how I could model this knowledge, I thought of analogies to unsupervised/supervised machine learning classification. I thought of analogies, metaphors, similes, and stories and drew sketches. I won't list them here, but it involved drawing sketches with stick figures of what I know about this topic. I then brainstormed ideas about how to physically show the data as it flows through a machine learning classifier or neural network, or some other teaching tool or experimental model. I picked one of the ideas and decided I would make a simple "maker" exercise for students to try. Hence, the "contraption" in the picture, made with a tube, holes, and small BB- and marble-sized balls of different colors. The fun part of this process is that the instructor gets to "make" a low-fidelity prototype proof of concept that will guide what the instructor then asks the students to "make," though not necessarily the same prototype. In other cases, the instructor's prototype will be the basis of the learning activity the students use. For example, one of the faculty, Mark Ryan, created a prototype game to teach "for" loops and "if" statements to non-computer scientists.

Prompting the Student to Make Prototypes and Use them to Assess their Learning

After explaining the concept of unsupervised/supervised machine learning classification, I would prompt the student to make something that demonstrates the concept. I wouldn't want to give them the answer but would be there to offer hints, clues, and positive encouragement to think of analogies and metaphors themselves. I would instruct and encourage them to use the low-fidelity prototype materials (a.k.a. craft supplies) to build their prototypes and test them on other students. If I'm being kind of vague, it's because I want to try this out on some of our students this fall to see what I learn first.

Innovating throughout our Engineering Curriculum

I am grateful for the many teaching innovations I have been able to experience and learn through workshops like the Stanford d.school Teaching and Learning Studio and the KEEN Network's MakerSpark I just went through in Boston, where I learned the concepts I'm sharing here. As an engineering leader, I'm grateful that the Kern Family Foundation provides these opportunities for all of our faculty to learn to innovate in their classrooms alongside other faculty in the KEEN Network. The opportunities are there, and it's up to us to seize them and make them a reality in our students' learning experiences.

Picture: A low-fidelity, hands-on teaching model for students to use data physicalization and making to learn the concepts of unsupervised and supervised machine learning classification.

© 2023 Andrew B. Williams

About the Author: Andrew B. Williams is Dean of Engineering and Louis S. LeTellier Chair for The Citadel School of Engineering. He was recently named one of Business Insider's Cloudverse 100 and humbly holds the designation of AWS Education Champion. He sits on the AWS Machine Learning Advisory Board and is a certified AWS Cloud Practitioner. Andrew has also held positions at Spelman College, University of Kansas, University of Iowa, Marquette University, Apple, GE, and Allied Signal Aerospace Company. He is author of the book, Out of the Box: Building Robots, Transforming Lives.

Standard
Artificial Intelligence, Cloud Computing, Computer Science, diversity, Education, Engineering, Entrepreneurial Mindset Learning, Entrepreneurship, Innovation, leadership, Robotics, STEM, Technology

Affirm Action to Empower Our Children’s Engineering and Computing Education

Who empowered or encouraged you growing up to pursue your education? I can name many, from my elementary school teachers and Sunday School teachers to my parents and older siblings. My Mom only had the equivalent of a third-grade education in Korea and learned to read English at an elementary school level. But when I would come home from school, she would say to me, "I believe Andrew makes all A's." She continued saying that up until college. She instilled academic confidence in me at a very early age. My Dad had a high school education growing up in Connecticut. He learned mechanics in the Army and kept learning by reading newspapers daily and magazines like Popular Mechanics and Popular Science. He could only afford old cars but would rebuild the engines all by himself in the backyard so he could transport all six of us kids.

My Sunday school teacher, Sharon Scoggin, was one of the few people I knew outside of school that went to college. I have known her since I was three years old. She and others would have us read Bible stories from the King James Version, so little did I realize I was reading at around the twelfth-grade level by the time I was in second grade. My oldest brother, Robert, led us down the path of college possibilities by enlisting in the Air Force and later being sent back to study electrical engineering at Kansas State, near our hometown. I remember the first time I saw his little LED circuit he created with Boolean logic, and I was fascinated. I decided I wanted to become an electrical engineer like him.

By the time I reached high school and took the PSAT test, I did really well. I was getting recruitment letters from Harvard, Stanford, MIT, Brown, and other places. I started to apply to Harvard and even went through the alum interview process. But I never completed the application. My Dad suggested I go to KU just down I-70, and then maybe go to an Ivy League school later. I guess he was scared we couldn’t afford it on his $100 a week paycheck and his $300 a month Army retirement check. He didn’t know much about college loans or scholarships. He pointed out that my other brother had gone to KU to study aerospace engineering, worked as an engineer for a couple of years, and then went to MIT to get two master’s degrees.

I sometimes wonder what would have happened if I had completed my application. Would I have been accepted to Harvard? Who knows. I do know that I’ve lived a blessed and prosperous life going to public schools all through high school and a public university. I have benefited from policies and benefactors that encouraged companies and institutions to recruit, hire, and retain individuals from various racial and ethnic backgrounds. Hey, I even had Steve Jobs, co-founder of Apple, see the benefit of hiring black engineers to work there, so much so that he hired me to help Apple do that. That’s a lot from one of the most impactful innovators we have seen in our modern day.

I grow concerned not so much about how policies are changing but about the message those changes send to our young people. Our message to our young people should be that they are welcome and belong at our most elite institutions as well as at all the other institutions that provide a world-class education, from elementary school to community college to university. I worry that the latest policy changes will impact not only our educational thinking and policies about race but also those related to gender. I want my daughters to always know that they belong in the engineering and computing fields. But I also want others' children to have the same types of opportunities that my children have because of the privileges I've been able to earn with God's help. I had to sign a waiver for my son to be able to take Calculus in high school in Atlanta because, as a Black boy, they didn't think he could succeed in it (which, of course, he did, and he went on to get a computer science degree). That's why I am passionate about helping other parents and children get excited about learning engineering and computing, and I hope you will consider doing the same.

Remember, the institution one studies at does not necessarily equate to happiness, success, or a good life. I and others like me are a testament to that. As someone who grew up in a low-income, first-generation home, I know that for a fact. My Mom didn't have a college education or even a high school diploma, let alone a degree from an Ivy League school. But she had something more in the spiritual realm. As my Sunday school teacher had me read as a child, "Blessed are you who are poor, for yours is the kingdom of God."

Picture: William Farr and me learning on an Apple II computer at Junction City High School back in the ’80s.

© 2023 Andrew B. Williams

About the Author: Andrew B. Williams is Dean of Engineering and Louis S. LeTellier Chair for The Citadel School of Engineering. He was recently named one of Business Insider's Cloudverse 100 and humbly holds the designation of AWS Education Champion. He sits on the AWS Machine Learning Advisory Board and is a certified AWS Cloud Practitioner. Andrew has also held positions at Spelman College, University of Kansas, University of Iowa, Marquette University, Apple, GE, and Allied Signal Aerospace Company. He is author of the book, Out of the Box: Building Robots, Transforming Lives.

Standard
Ai, Artificial Intelligence, Cloud Computing, Computer Science, Education, Engineering, Entrepreneurial Mindset Learning, Innovation, Robotics, STEM, Sustainability, Technology

Real Intelligence: Remembering the Value and Superiority of Humans over Machines in the Age of AI

“The only way AI will cause humanity’s existence to cease is if humanity only primarily invests in AI but fails to value and invest in humanity itself.” – Andrew B. Williams

The narrative for AI, particularly generative AI, is being set in the media and on the national stage. Some say it will eventually cause our civilization to end, that it will make white-collar and knowledge-worker jobs slowly disappear, and that no one is safe. Or so the narrative goes. It also says there is an AI arms race and that we must invest as much capital as possible into this new technology.

One thing saddens me about this narrative. There are so many "things" that are so much more capable and amazing than generative AI, so many more things worthy of our investment of time and money. And they are not actually things at all. They are individual humans themselves.

I appreciated the value and superiority of humans afresh while listening to Dr. Bobby Kasthuri, a neuroscientist at Argonne National Laboratory, speak at the KEEN National Conference in January 2023. Although computer scientists have created artificial neural networks very roughly modeled after one aspect of how the brain works, no one has actually mapped the connections in the human brain. A single human brain has about 100 billion neurons, each with 1 to 10,000 connections to other neurons, creating on the order of a quadrillion connections. That's 100 to 1,000 times more than the number of stars in our galaxy!

Dr. Kasthuri is seeking to build a supercomputer system large enough and powerful enough to map all of the connections in the human brain. He believes this map of our connectome will lead to an understanding of the physical basis for all of our memory, skills, hopes, thoughts, and dreams, much like the mapping of the human genome has done for understanding the genetic basis of life. We may also be able to determine how misfirings of the brain produce pathological effects, such as autism and schizophrenia. He has worked on mapping the connectome of a mouse. Mapping the human genome resulted in about 200 GB of data; mapping the mouse connectome produces about 2,000,000,000 GB (roughly 2 exabytes). Imagine how much data mapping the human connectome will produce.

Recently I watched a 60 Minutes story on AI. The reporter seemed amazed at what the generative AI (GenAI) model produced. They later checked the books that the GenAI model had recommended and found that they were made up and not real. I also recently read that a comedian is suing a company over its GenAI model because it summarized her book without the company ever purchasing it. Could it be that the, should I say, "scam" of some GenAI is that it "steals" data produced from existing human creative works of art, literature, audio, and video to recreate "original," AI-generated pieces? Is that "amazing," or is it unethical?

I could end there, but I want to say that we need to be re-amazed by how amazing real humans are. As Dr. Kasthuri said, yes, an AI program can beat a human at chess, but an AI-enabled robot cannot walk up to the table, sit down, move the pieces, and think the way a real human can. In a future blog post, I plan to write about the environmental impact involved in training these GenAI models, an issue brought to light by Dr. Timnit Gebru, formerly of Google, and others. But while training one of these large GenAI, or foundation, models requires enormous power, energy, and natural resources, the human brain operates on only about 20 watts of power, less than most light bulbs. In contrast, the supercomputer Dr. Kasthuri plans to build will require the power of an entire power plant.

So let's not just tout the benefits and possibilities of generative AI and say we must invest billions of dollars to develop and sustain it. Let's not forget about the human child down the street, or in another country, who has incredible brain capabilities but is living in poverty, and who is far more amazing, intelligent, and worthy of our investment in their education, well-being, and future than a machine that copies our original work so others can tout it as their own. Let's invest billions in our own human children. One of those children may grow up with a big dream to map the connectome of the human brain to help others learn and to treat brain disease. Another might grow up to become a Dean who leads others to become engineers and leaders that build a better world.

“It’s more important to invest in, encourage, educate, and empower a human child than it is to invest in a machine, computer, or algorithm.” — Andrew B. Williams

Picture: Dr. Narayanan “Bobby” Kasthuri and me at the 2023 KEEN National Conference in Atlanta, where he spoke on mapping and understanding the brain so that we can learn, teach, and cure diseases of the brain. His fascinating talk can be found here.

© 2023 Andrew B. Williams

About the Author: Andrew B. Williams is Dean of Engineering and Louis S. LeTellier Chair for The Citadel School of Engineering. He was recently named one of Business Insider's Cloudverse 100 and humbly holds the designation of AWS Education Champion. He sits on the AWS Machine Learning Advisory Board and is a certified AWS Cloud Practitioner. Andrew has also held positions at Spelman College, University of Kansas, University of Iowa, Marquette University, Apple, GE, and Allied Signal Aerospace Company. He is author of the book, Out of the Box: Building Robots, Transforming Lives.

Standard
Artificial Intelligence, Cloud Computing, Computer Science, Design Thinking, diversity, Education, EML, Engineering, Entrepreneurial Mindset Learning, Entrepreneurship, Innovation, leadership, Robotics, STEM, Technology

Educating Innovative Engineers that Impact the World

Because of the challenges our nation, military, and world are facing, we are seeking to educate and train the next generation of principled engineering leaders and innovators by setting our School of Engineering's direction around four strategic priorities: infrastructure, innovation, interdisciplinarity, and inclusion.

On June 30, 2023, I finished my second year as Dean of Engineering at The Citadel, and I'd like to take a moment to reflect on and celebrate what we have accomplished. We began with a vision of educating innovative engineers who impact the world through principled leadership. Why? The world is facing enormous challenges related to health, sustainability and resilience, energy, security, infrastructure, education, poverty, information, artificial intelligence, clean water, and more. We need the next generation of engineers to be prepared to innovate and solve problems in ways that take into account economic viability, systems interconnectedness, the need for multidisciplinary teams, an understanding of various cultures, and the impacts on diverse communities and our global society. Our strategic priorities to prepare our students to face these challenges are:

  • Infrastructure for Growth (Physical and Virtual)
  • Innovation throughout the Curriculum and Co-Curriculum
  • Interdisciplinary Collaboration across Departments, Schools, and Industries
  • Inclusion and Outreach

What have we been able to accomplish in the past two years?

INFRASTRUCTURE FOR GROWTH (PHYSICAL AND VIRTUAL)

We have been able to raise $53.5 million for our new Engineering building from the South Carolina Legislature. We are very grateful for the State's support of Engineering at The Citadel. It was a huge team effort from those who lead, work at, graduated from, are friends of, and are affiliated with The Citadel. Over the two years since I've been Dean, we have set records for how much The Citadel has ever received from the state. We know that it's a strategic investment in the future workforce and industry here, and we could not be happier or prouder for all those who are friends and faculty of The Citadel School of Engineering.

For our virtual infrastructure, we piloted a class that taught students using the Amazon Web Services (AWS) Cloud infrastructure for our introductory artificial intelligence (AI) course for decision-makers. More on that later in this blog post.

INNOVATION THROUGHOUT THE CURRICULUM AND CO-CURRICULUM

We were invited to join an exclusive set of engineering institutions that focus on teaching faculty to educate innovative engineering students. As one of 50+ institutions invited to become a KEEN Partner Campus, we have been provided $100,000 so far for entrepreneurial mindset (EM) faculty development workshops and conferences that teach our faculty to integrate entrepreneurial mindset learning (EML) into our engineering courses. Around 38% of our faculty have been involved in KEEN through national conferences, faculty workshops, and developing curriculum innovations in the classroom. These workshops have covered entrepreneurial mindset topics including leadership, maker spaces, inclusion and diversity, engaging industry, problem-solving studios, developing faculty mentorship programs, and more. And in less than a year, we already have two Engineering Unleashed Fellow-nominated faculty, Dr. Deirdre Ragan and Dr. Simon Ghanat, who are being recognized nationally by their peers for their contributions to engineering education, specifically entrepreneurial engineering. Dr. Mostafa Boutali has become one of our designated KEEN leaders and is working with others to integrate an EM strategy for the School. Another innovation we have introduced is our first-year engineering experience, BIIG DOGS (Belonging Innovative Instruction Groups Designed for On-time Graduation Success), sponsored financially by Hytrol and the Kern Family Foundation.

INTERDISCIPLINARY COLLABORATIONS ACROSS DEPARTMENTS, SCHOOLS, AND INDUSTRIES

We were excited to join Clemson University and eleven other institutions across South Carolina on a successful, interdisciplinary $20 million, five-year NSF EPSCoR grant to transform healthcare across the state through AI-enabled biomedical engineering devices. The $1 million in funding The Citadel School of Engineering will receive will enable us to form and operate our Center for AI, Algorithmic Integrity, and Autonomy Innovation (AI3) to develop our future AI-literate and AI-proficient healthcare workers and K-12 students. We'll focus on K-12 outreach to underserved rural and urban schools, create learning pathways in AI for all Citadel students, not just engineering students, and conduct research with our partners in ADAPT in SC. Our successful application was due in part to our School of Engineering AI task force, which included Drs. Siripong Potisuk, Kweku Brown, and Nathan Washuta, and our use of design thinking as a strategic planning process.

Through our partnership with Amazon Web Services (AWS) Machine Learning University (MLU) and my role as an AWS MLU Advisory Board member, we are working to educate faculty at HBCUs, MSIs, and community colleges across the nation in AI and machine learning. This summer, five of our engineering faculty are going through AWS MLU AI Boot Camps to begin preparing them to integrate AI into our engineering curriculum. As the saying goes, AI isn't going to replace engineers, doctors, and educators; engineers, doctors, and educators are going to be replaced by engineers, doctors, and educators who know and use AI effectively.

We are also excited that faculty in three of our departments, Dr. Greg Mazzaro, Dr. Kevin Skenes, and Dr. Timothy Woods, are working on an interdisciplinary first-year introduction to engineering course that will be piloted this Fall 2023. This project was initiated and funded through School of Engineering Vision Grants made on the recommendation of the faculty-led Vision Task Force we began in 2021. The Vision Task Force included Drs. Timothy Woods, Deirdre Ragan, Nahid Vesali, Nathan Washuta, Siripong Potisuk, Ronald Hayne, and Kweku Brown.

INCLUSION AND OUTREACH

Our School of Engineering developed a diversity plan in 2021 and was awarded the American Society for Engineering Education (ASEE) Diversity Recognition Bronze Level. And our efforts continue with our School of Engineering Inclusion and Outreach Task Force. Led by Dr. Mark McKinney, the task force helped with the optional Diversity, Equity, and Inclusion criteria report in our ABET Self-Study for our new Computer Engineering ABET application. Our newest, and first woman and Latina, Electrical and Computer Engineering faculty member, Dr. Sylmari Montero-Davila, participated in the KEEN Enhancing Inclusive Teaching Practices through Entrepreneurial Minded Learning (EML) workshop and is working on a project to help all of our faculty in this area. Dr. Eva Singleton became the first African American woman faculty member in our engineering program's 180+ year history this January. When I first arrived at The Citadel in 2021, I was warmly welcomed and recognized by The Citadel African American Alumni Association (CA4) members as the first Black Dean ever at The Citadel and the School of Engineering. Among those who welcomed me were Arnold Benson, '73, the School's first African American engineering graduate, and Dr. Larry Ferguson, '73. There is much more work to do, but together, we will succeed. To God be the glory.

© 2023 Andrew B. Williams

About the Author: Andrew B. Williams is Dean of Engineering and Louis S. LeTellier Chair for The Citadel School of Engineering. He was recently named one of Business Insider's Cloudverse 100 and humbly holds the designation of AWS Education Champion. He sits on the AWS Machine Learning Advisory Board and is a certified AWS Cloud Practitioner. Andrew has also held positions at Spelman College, University of Kansas, University of Iowa, Marquette University, Apple, GE, and Allied Signal Aerospace Company. He is author of the book, Out of the Box: Building Robots, Transforming Lives.

Standard
Ai, Artificial Intelligence, Cloud Computing, Computer Science, diversity, Education, Engineering, Entrepreneurial Mindset Learning, Robotics, STEM, Technology

AI3: Launching Our Center for AI, Algorithmic Integrity, and Autonomy Innovation at The Citadel

Recently I was a keynote speaker on AI at the Gulf Coast AI Conference at Houston Community College and was also able to present at the AWS AI Educator Enablement Webinar on "Bridging the AI Divide." I am so excited that our partnership with Clemson University and other distinguished higher education institutions in South Carolina has resulted in a $20M National Science Foundation grant, with The Citadel School of Engineering receiving over $1M over five years toward funding our new Center for AI, Algorithmic Integrity, and Autonomy Innovation (AI3).

Clemson leads the NSF EPSCoR project Artificial Intelligence-enabled Devices for the Advancement of Personalized and Transformative Healthcare in South Carolina, or ADAPT in SC. They graciously invited The Citadel and other PUIs, HBCUs, and a community college, along with the Medical University of South Carolina (MUSC) and the University of South Carolina, to participate. What will our part be?

As I've shared in previous blogs, we are being impacted by the AI Divide: on one side are those who have the hardware, software, capital, algorithms, and products for AI, and on the other are those of us who provide the data to those companies, most of the time for free. AI has the ability to exploit people's lives but also the power to enhance and empower them. Through ADAPT in SC and The Citadel Center for AI3, we will be working collaboratively across the state to use the power of AI to enhance and transform the healthcare and lives of South Carolinians.

We are excited for our faculty, staff, and students to participate in biomedical engineering and AI research, workforce development, and K-12 outreach. I am able to build on the work I did through the National Science Foundation National Robotics Initiative on culturally responsive middle school AI camps. I am also able to leverage the work we have been doing with the AWS AI Educator Enablement program to develop curriculum at The Citadel that provides learning pathways to AI literacy, proficiency, and mastery for all of our Citadel cadets and students. Our Center for AI3 will exist virtually until it is housed in our new engineering building.

Thank you, Clemson, for your leadership and collaboration, the National Science Foundation, our leadership at The Citadel and our faculty in engineering, the other ADAPT in SC partners, and our partners at the AWS Machine Learning University Educator Enablement Program.

Picture: Dr. Andrew B. Williams speaking at the Gulf Coast AI Conference in March 2023. Photo provided by Neethi Gangidi.

© 2023 Andrew B. Williams

About the Author: Andrew B. Williams is Dean of Engineering and Louis S. LeTellier Chair for The Citadel School of Engineering. He was recently named one of Business Insider's Cloudverse 100 and humbly holds the designation of AWS Education Champion. He sits on the AWS Machine Learning Advisory Board and is a certified AWS Cloud Practitioner. Andrew has also held positions at Spelman College, University of Kansas, University of Iowa, Marquette University, Apple, GE, and Allied Signal Aerospace Company. He is author of the book, Out of the Box: Building Robots, Transforming Lives.

Standard
diversity, Education, EML, Engineering, Entrepreneurial Mindset Learning, Innovation, STEM, Technology

Rethink and Unlearn: Creating a Community of Learners #KEENBookClub

When it comes to considering new ideas, would you consider yourself more of a politician, prosecutor, professor, or scientist? This week our new KEEN Book Club began our discussion of Adam Grant's book, "Think Again: The Power of Knowing What You Don't Know."

A Politician, Professor, Prosecutor, and Scientist Walk Into the Room

Often, without realizing it, we take a stance on a topic of discussion by insisting our side is right and the other side is wrong in order to please others (a politician), by assuming our own idea is the only right one (a professor), by shooting down everyone else's ideas as wrong (a prosecutor), or by deciding whether an idea is good or right based on evidence (a scientist). In other words, a scientist is willing to rethink old theories and adages and "unlearn" them in order to make new discoveries. The scientist approach really emphasizes the curiosity aspect of an entrepreneurial mindset, something that is a focus of the faculty and institutions in KEEN, the Kern Entrepreneurial Engineering Network.

What Constitutes an Education

We are just getting started on our book discussion, but one of our faculty quickly asked: how do we get our students to rethink ideas? One of our faculty said there are two approaches to education: filling students' minds with knowledge or teaching them how to think. He adheres to the latter. He shared an experience where he let his students know that he didn't know something about a particular engineering topic, and he was surprised how much of a positive impact it had on students when he reviewed his course evaluations at the end of the semester. As the book describes, the rethinking cycle is a big contrast to the overconfidence cycle, always believing you know the right answer or that your way is the best way.

The Overconfidence Cycle

The overconfidence cycle starts with pride, moves to conviction, then to confirmation and desirability bias, and then to validation; the cycle then repeats, starting again with pride. Because of our pride in being right, we can selectively search for evidence that proves our ideas or ways of doing things are right while dismissing any evidence that contradicts our beliefs.

The Rethinking Cycle

The rethinking cycle starts with humility, or the admission that we don't know everything and are not always right. Recently, the media has been surprised by how much "intelligence" large, heavily trained AI language models seem to have as they spit out answers or essays that are often, but not always, good or correct. It is said that GPT-4 can pass the bar exam (I'd like to see the evidence for that). If this is true, and the AI language model is trained on only 1% or 2% of all known knowledge, think about how little we ourselves know. How many of us could pass the bar exam without ever studying for it?

For the rethinking cycle, after humility come doubt, curiosity, and discovery. Do we lack the courage to admit we don't know something and risk the doubt that comes with it? Sometimes we tie our identity so closely to what we think we know and believe to be true that we are afraid to truly consider all angles of an idea or a product decision. Thinking like a scientist, someone willing to experiment and examine the evidence before drawing conclusions, does not come naturally, and we often need to "unlearn" the things we always thought to be true.

The Frog in the Hot Water

One example Grant provides of a "true" idea most people believe without questioning is the old adage about the frog and the hot water. It is said that if you throw a frog into hot, scalding water, the frog will sense the heat and jump out immediately so it won't die. On the other hand, it is said that if you let the frog sit in the water while it gradually heats up, it will die, because by the time it senses the heat it will already have been burned to death. Grant examined the evidence and found that this old adage is not true in all cases. In fact, if you throw a frog into scalding water, it may be so hot that the frog dies immediately. And if the water heats up slowly, the frog will notice the unbearable heat and jump out before it gets too hot.

A Community of Learners

What will it take at your organization or institution to build a community of learners? How often have you heard where you work or study, "It's not done that way here," or "That will never work here"? How about, "That's too complicated," or "That's not my experience, so it must not be right"? As engineering educators, if we want to successfully build a community of learners, we need to help our faculty realize that they need some humility and openness so they can, in turn, teach their students to rethink and unlearn. In this type of community, the faculty can also learn from the students.

Conclusion

To be honest, now that Grant has raised doubts about my belief in the "frog and hot water" adage, I'm not sure who or what to believe. He's made me curious to discover what is actually true about whether that frog will live or die. Isn't that what rethinking and unlearning are all about?

Picture: The Citadel Engineering and English faculty at the start of our first KEEN Partner Campus Book Club, using the book Think Again: The Power of Knowing What You Don't Know, by Adam Grant.

© 2023 Andrew B. Williams

About the Author: Andrew B. Williams is Dean of Engineering and Louis S. LeTellier Chair for The Citadel School of Engineering. He was recently named one of Business Insider's Cloudverse 100 and humbly holds the designation of AWS Education Champion. He sits on the AWS Machine Learning Advisory Board and is a certified AWS Cloud Practitioner. Andrew has also held positions at Spelman College, University of Kansas, University of Iowa, Marquette University, Apple, GE, and Allied Signal Aerospace Company. He is author of the book, Out of the Box: Building Robots, Transforming Lives.

Standard