All posts by Jennifer Adams

Meeting the students from Longwood and Kings Park Robotics Club at CEWIT

Kings Park and Longwood High School First Robotics Showcase

Robots, Reefs, and Rising Talent: Reflections from the CEWIT K-12 Robotics Showcase

On May 29, 2025, I had the opportunity to attend a remarkable student robotics showcase at Stony Brook University’s CEWIT (Center of Excellence in Wireless and Information Technology). Hosted in collaboration with the Office for Research & Innovation, the event brought together robotics teams from Kings Park and Longwood High Schools for a morning of demos, dialogue, and discovery.

As someone passionate about STEM education, it was inspiring to speak directly with the students—not just about their robots, but also about their strategies, goals, and future pathways. They shared how they had spent the season analyzing the competition, scouting other teams to understand how different alliances approached the same challenge. Their enthusiasm was infectious, and their grasp of both engineering and game dynamics was impressive.

Inside the Challenge: REEFSCAPE presented by Haas

All the showcased robots were built for the 2025 FIRST Robotics Competition game, REEFSCAPE. In this ocean-themed match, student-built robots are tasked with completing a series of complex actions:

  • Coral Seeding: Robots carry and place PVC tubes—representing coral—onto various levels of a reef structure. The higher the placement, the more points earned. If coral is seeded across all four levels, teams gain a crucial ranking point.

  • Algae Delivery: Robots harvest “algae” and deliver it either to a team net or to a processor, where human players on the opposing alliance score it into the net.

  • Barge Retrieval: In the final dramatic moments of the match, robots race back to their alliance’s barge, grab hold of a cage, and lift themselves completely off the ground—no small feat, considering the weight and precision involved.

The students explained how they engineered their robots to meet each task and adjust mid-season as they learned from other teams. They lit up when asked to explain their mechanisms, code, or strategy shifts, and they genuinely enjoyed talking to people who understood the nuances of competitive robotics.

A Championship Season

It was particularly exciting to hear about their journey to the 2025 FIRST Long Island Regional, where Team 564 (Longwood RoboLions) and Team 5736 (Kingsmen Robotics) joined Team 353 (POBots) to win the event. Team 564 also took home the FIRST Impact Award, the highest honor in the competition, recognizing their outreach, innovation, and leadership.

That success earned them a spot at the FIRST Championship in Houston, Texas, where Team 564 competed in the Johnson Division. While their official record was 2–8, their participation on the global stage—among the top teams in the world—was a major accomplishment and a source of pride for their schools and mentors.

Looking Ahead

As Derek O’Connor, CEWIT’s Workforce Development Manager, shared in his follow-up message, the students were thrilled by the experience:

“They’re not used to speaking with people so knowledgeable about these projects… they really enjoyed answering questions, getting feedback, and having a chance to show off a little bit.”

And rightfully so. They did an amazing job—not just building competitive machines, but embodying the kind of teamwork, persistence, and curiosity that robotics education is all about.

I left the showcase feeling hopeful and energized. These students aren’t just tomorrow’s engineers—they’re already innovators today. I look forward to seeing where they go next and to being part of more events like this that connect education with opportunity, one robot at a time.

Oh and Cooper… if you are reading this… I’ll be waiting for you in two years when you attend SBU and come to work in my lab!


Combining 3D Printing and AI

As my faithful followers will know, I’ve been reconditioning some used 3D printers.  This week I’ve been working on an old Ultimaker 2+.  To get it to print anything, I had to load a file onto an SD card, as this printer has no native way to connect to a computer or network.  I didn’t have an SD card reader for my Mac at first, so I printed some files that were already on the card; after some poking and prodding that did eventually work, but the files were uninteresting.  Around this time, my coworker Victoria Pilato let me borrow her SD card reader, so I happily went looking for something to print on Thingiverse, found a cute little fancy statue, and set to printing.

Then I stopped the print job…

Even though the files on the card had been boring, they were at least printing OK.  This time the printer was trying to print from a few millimeters above the print bed.  I readjusted the bed calibration (which seemed fine, as I had already done it earlier) and still had the problem.

Next morning, I’m feeling ready to tackle this.  I ask ChatGPT.  It suggests my bed could be off or POSSIBLY something weird in the G-code is telling it to start the first layer higher on the Z axis.  I poke around in Ultimaker Cura, trying to find the issue, but I don’t see the problem.
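If you want to check the “starts too high” theory yourself before asking an AI, the first layer height is sitting right there in the sliced file.  Here is a minimal Python sketch that reports the first Z height a G0/G1 move requests; the file name is hypothetical, and real G-code can include priming moves, so treat it as a quick sanity check rather than a diagnosis.

—————————————————————–

# Minimal sketch: report the first Z height requested by a movement command.
# The file name is a placeholder; real files may begin with priming moves.
import re

with open("statue.gcode") as f:
    for line in f:
        line = line.split(";")[0]  # strip comments
        match = re.search(r"\bG[01]\b.*\bZ([-\d.]+)", line)
        if match:
            print(f"First Z move: {float(match.group(1)):.2f} mm")
            break

—————————————————————–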

  • AI Fixes the G-code

It occurs to me that maybe ChatGPT can just read the G-code.  I dump it on the AI, and it turns out that it can.  It tells me mostly everything looks all right… BUT it says the file is configured for the Ultimaker 2+ Connect, which has auto-leveling, where my model does not, and that I should fix that in Cura.  Now, in Cura, I can clearly see that I have it configured for the correct printer, not the Connect.  So I ask GPT to fix the code directly, instead of me doing it in Cura.  And it does.
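If you are curious which machine a sliced file thinks it is for, the slicer usually leaves hints in the comment lines at the top of the file.  The exact header keys vary by Cura version and G-code flavor, so this little Python sketch (file name hypothetical again) just prints any header comments that mention a machine, target, or flavor and leaves the interpretation to you.

—————————————————————–

# Minimal sketch: surface header comments that hint at the target printer.
# Header formats differ between Cura versions and G-code flavors, so this
# only collects candidate lines for a human to read.
KEYWORDS = ("flavor", "machine", "target", "printer")

with open("statue.gcode") as f:
    for line in f:
        if not line.startswith(";"):
            break  # stop once the actual movement commands begin
        if any(k in line.lower() for k in KEYWORDS):
            print(line.rstrip())

—————————————————————–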


  • AI Suggests a New 3D Model

I have already skipped the fancy little model and have been working on a simple Benchy today.   After GPT provides me with the new code… and that code is happily chugging away on the printer, I ask it if it can alter the code to actually change the shape of the model.  It says it cannot, tells me what programs to use… and yet still offers to make a prototype.  I say sure.  It says it will give me code, but instead gives me an image.  It again tells me it can generate an .stl file, but instead makes a placeholder, with another image.

Oh, by the way, I’ve told it to merge the Benchy with a UFO – because, why not?

UFO-infused Benchy as created by GPT. It has saucer elements both around the base of the boat and above, almost like an umbrella.

This is an AI-rendered image, not a picture of a printed model.

  • AI Converts a 2D Image into a 3D Render

So at this point, I’m pretty happy with the cute Benchy from Outer Space, so I decide to download that image and bring it into Meshy.  Meshy is a browser-based AI 3D model generator that can make a model either from a text prompt or from an image.  I’ve previously made a little Seawolf with it, starting from an image.

original wolfie picture


3d creation by Meshy

meshy model from image

printed meshy model with no clean up attempted

printed meshy model

The one thing I wasn’t thrilled about the first time I printed Wolfie straight from Meshy was that it lost some of the detail and looked a bit blobby.

From inside Meshy, I loaded the picture I wanted it to generate a 3D model from.  As is typical when you ask any AI to make something, it gives me four choices.  You can see that they are doing the usual AI thing: not really trying to just make what you gave it, but providing you with some creative adaptations.

AI created versions of UFO/Benchy

Some of the choices are already less blobby than others, but I really wanted the one that looked the most like what GPT had made, so I went with the one in the top left – which already looks lumpy.

UFO-infused Benchy as created by GPT. It has saucer elements both around the base of the boat and above, almost like an umbrella.

  • AI Makes Recommendations to Refine Meshy Model

Me: “I have brought in a model into Blendr from meshy. The model looks kind of globby. Is there an easy way to clean it up so it looks more defined?”

GPT:

Top Reasons for “Globby” Models from Meshy and How to Fix Them

🟠 1. Surface is Too Smooth (Normals or Shading)

  • Meshy sometimes auto-applies smooth shading, which can make details look melted or blobby.

✅ Fix in Blender:

  1. Select the model in Object Mode.

  2. Go to the left toolbar or press Right-click → Shade Flat.

    • Instantly sharpens up the look by reverting from smooth shading.


🟠 2. Mesh is Over-Triangulated or Dense

  • Meshy-generated models often have millions of triangles, making them look blobby or noisy.

✅ Fix:

  • In Edit Mode, run:

    • F3 → search Decimate Geometry (non-destructive, try ratios like 0.2–0.6)

    • Or use Remesh Modifier (Voxel mode, low voxel size like 0.05–0.1)


🟠 3. Normals Are Messed Up

  • If the surface has weird lighting or jagged shading, the normals may be flipped.

✅ Fix:

  1. Go into Edit Mode (Tab)

  2. Press A to select all

  3. Press Shift + N to Recalculate Normals


🟠 4. Model Scale Is Off

  • Sometimes imported models are tiny or huge and look off.

✅ Fix:

  • Press N for the sidebar → check scale

  • Reset with Ctrl + A → Apply Scale


🧠 Tip: For 3D Printing

  • Use Mesh → Clean Up in Edit Mode to:

    • Delete loose geometry

    • Fill holes

    • Merge by distance

    • Remove doubles


If you want, you can send me a screenshot or description of what it looks like, and I’ll give you a specific cleanup sequence.

Or — I can write you a custom macro that does it all in one go!

Me: Macro???  Yes please!

Here is the gold — lol.  (I didn’t know you could throw a Python script into Blender.  Even if you did, maybe you hadn’t thought to have AI make it for you.)

—————————————————————–

import bpy
import bmesh

# Get active object and ensure it's a mesh
obj = bpy.context.active_object
if obj and obj.type == 'MESH':

    # Ensure we're in Object Mode
    bpy.ops.object.mode_set(mode='OBJECT')

    # Shade Flat
    bpy.ops.object.shade_flat()

    # Add Subdivision Surface Modifier (non-destructive)
    subdiv = obj.modifiers.new(name="Subdivision", type='SUBSURF')
    subdiv.levels = 2
    subdiv.render_levels = 2

    # Enable Auto Smooth to preserve sharp edges
    obj.data.use_auto_smooth = True
    obj.data.auto_smooth_angle = 1.0472  # 60 degrees in radians

    # Go into Edit Mode for cleanup
    bpy.ops.object.mode_set(mode='EDIT')
    bm = bmesh.from_edit_mesh(obj.data)

    # Merge by distance to remove doubles
    bmesh.ops.remove_doubles(bm, verts=bm.verts, dist=0.0001)

    # Recalculate normals
    bmesh.ops.recalc_face_normals(bm, faces=bm.faces)

    bmesh.update_edit_mesh(obj.data)
    bpy.ops.object.mode_set(mode='OBJECT')

    print("✨ Cleaned, smoothed, and preserved edges!")

else:
    print("⚠️ Error: Please select a mesh object in Object Mode.")

—————————————————————–

Here is what it looks like after applying the script:

It isn’t perfect… but it is a whole lot better.  BTW – Blender is free; Meshy has a paid version that may very well make my models look better, but I am using the limited free version; and the original Benchy model is free from Thingiverse, as are many other items (though tips are encouraged (remember beerware??)). I pay for ChatGPT, but I suspect this could have been achieved with the free version as well.

After this, I brought it over to Formware to do some last-minute manifold repairs on the STL file.

“You can use this free online tool without hassle to fix your stl files quickly. Files are deleted after download or otherwise after 6 hours to keep our systems clean. The fixer accepts binary and ascii .stl files.”
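If you would rather do a quick sanity check locally before uploading, here is a minimal Python sketch using the trimesh library; this is just my own aside, not how Formware works, and the file names are hypothetical.  It reports whether the mesh is watertight and attempts a couple of simple repairs.

—————————————————————–

# Minimal sketch: check an STL for watertightness and try basic repairs.
# Requires the trimesh library (pip install trimesh); file names are placeholders.
import trimesh

mesh = trimesh.load_mesh("ufo_benchy.stl")
print("Watertight before repair:", mesh.is_watertight)

trimesh.repair.fix_normals(mesh)  # make face winding/normals consistent
trimesh.repair.fill_holes(mesh)   # fill simple holes in the surface

print("Watertight after repair:", mesh.is_watertight)
mesh.export("ufo_benchy_fixed.stl")

—————————————————————–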

And here is the print after the supports are removed:

UFO/Benchy printed out


AUTODESK: Make it Home Student Design Contest


https://www.instructables.com/contest/makeithome2025/

(from the website:)

In this contest your challenge is to design an affordable housing solution that could also address another problem. We are giving away more than $50,000 in prizes to the best entries!

Designing and building permanent, supportive housing is a collective effort. How might we reimagine policy frameworks and think differently across a variety of industries and sectors to address this problem? What if affordable housing could solve other problems in a community?

We are excited to launch a new chapter of the annual Make It Real contest with NFL quarterback Joshua Dobbs, who studied aerospace engineering as an undergrad and interned at NASA. In the new “More Than a Destination” film series, Josh visits Factory_OS, a California-based affordable housing builder, located inside a repurposed shipyard hangar with a fascinating history. Watch the films to learn more about the problem behind this year’s challenge!

  • Individual Prize: Best Use of CAD or BIM Tools

  • Individual Prize: Best Physical Prototype

  • Individual Prize: First Prize Scholarship Award

  • Team Prize: Best Use of CAD or BIM Tools

  • Team Prize: First Prize Scholarship Award


SBU students can contact jennifer.adams@stonybrook.edu if they would like assistance from the SBU Libraries in entering this contest.

Closes June 30th!


Bringing Two UP Mini 2 Printers Back to Life

This past week I’ve rolled up my sleeves and tackled checking whether two new-to-us UP Mini 2 3D printers from Tiertime were still functional. These compact machines had been tucked away for quite some time—unfortunately, in a room that suffered a significant flood. One unit had been stored in its original box (lucky break), while the other had been sitting exposed on the floor. Needless to say, I had some concerns about their condition.

The two 3d printers sitting on a table.

I started with the boxed unit, which appeared to be in better shape. When I loaded a fresh spool of filament, it threw an “Insufficient Material” error—even though I could clearly see the filament had been fed properly. The fix? Manually entering the filament weight (500g), which this printer model requires in order to track usage accurately. Once that was squared away, I ran a test print—not the traditional Benchy, but a Calibration Mouse. (Let’s just say the loaded green filament gave it some extra personality.)

The second printer needed more attention. The nozzle was gummed up and required a good cleaning, along with a bit of tinkering to get everything running smoothly. After some patience and perseverance, I managed to coax it back to life. The reward? A second successful test print—this time in red.

Why the UP Mini 2?

The UP Mini 2 might not be the flashiest printer on the market, but it’s a solid workhorse, especially for educational and makerspace environments. Its fully enclosed build area makes it safer and better for temperature-sensitive materials like ABS. It’s compact enough for a desktop yet capable of reliable, high-quality prints with minimal fuss. Features like HEPA filtration, auto-leveling, and built-in touchscreen controls make it accessible for beginners while still meeting the needs of more advanced users.

With both printers now tested and functional, our upcoming space has two more reliable machines ready to support student projects, prototyping, and creative exploration.

picture of the calibration mouse, which includes some angles and lengths.


Me holding a little green calibration cat.


Me holding both the red and green calibration cats.


Making Your 3D Prints Mouth-Safe and Food-Friendly: What You Need to Know

When I printed my first double-chambered flute on my 3D printer, I was thrilled to see it come to life. But as I prepared to play it, a question nagged at me: Is it actually safe to put this in my mouth? That one query opened a whole new world of research into the safety of 3D printed objects—especially those that come into contact with food or the human body.

Me with my new 3d printed double flute

(who am I kidding – this is made of PLA and I did blow into it before even considering all of this.)

3D printing offers incredible possibilities—from custom cookie cutters to personalized water bottle lids and musical instruments. But just because something can be printed doesn’t mean it should be used without considering safety, especially for anything that goes near your mouth or food.


Understanding Food Safety Terms

Before diving deeper, let’s get a handle on a few critical terms:

  • Food grade: A material is safe for human consumption or is allowed to come into contact with food.

  • Food safe: A food grade material that meets safety requirements for its intended use and won’t pose a health risk.

  • Food contact surfaces: These are surfaces that touch food or your mouth and must be nontoxic, durable, and easy to clean.

These terms aren’t just relevant for kitchen utensils—they’re also important for things like wind instrument mouthpieces that see prolonged contact with your lips. The underlying concern is migration—the transfer of small particles from the printed object into food or the body.


Why Basic 3D Prints Aren’t Safe (Even with “Food Safe” Filament)

Here’s the truth: even if your filament is labeled “food safe” or “food grade,” that doesn’t make your printed object safe by default.

  • Fused Filament Fabrication (FFF) creates objects with microscopic crevices between layers—perfect hiding spots for bacteria and mold.

  • These gaps are hard to clean, even with a dishwasher (if the material can tolerate high temperatures).

  • Studies show that untreated 3D prints can harbor significantly more bacteria than coated or properly treated ones.

This is especially problematic for items with repeated or long-term contact, like flutes or cup rims.


Material Considerations: Not All Filaments Are Created Equal

Some materials are potentially safer than others:

  • PLA, PETG, and PP (polypropylene) are commonly considered food-safe options—but only under certain conditions.

  • ABS and ASA? Not safe for food contact.

  • Filament colorants and additives matter too. Even if a base polymer is safe, pigments or other ingredients might not be. Always check with the manufacturer.

For commercial or high-stakes use, look for certified materials—like those compliant with FDA 21 CFR (U.S.) or EU 10/2011. Just be aware: “compliant with” is not the same as certified or approved.


The Printer Itself Can Be a Source of Contamination

Even if you’re using the right filament, your printer might not be clean enough for safe food contact.

  • Replace brass nozzles with stainless steel—brass can contain lead.

  • Avoid cross-contamination from previous filaments or lubricants.

  • Keep your printer—inside and out—as clean as possible.


Why Cleaning my Flute isn’t Good Enough

  1. Layer Lines = Micro-Crevices

    • FDM/FFF 3D prints, including PLA, have tiny ridges and gaps between layers.

    • These crevices trap saliva, moisture, and bacteria, which can’t be fully removed—even with scrubbing, soap, or alcohol.

  2. Biofilm Formation

    • Bacteria can form biofilms—slimy protective layers—that resist even harsh cleaners.

    • Over time, this can lead to odors, staining, or health hazards.

  3. Porosity Makes Sterilization Difficult

    • Unlike glass or stainless steel, PLA is porous.

    • That means typical cleaning doesn’t penetrate deep enough to sanitize completely.

  4. Heat-Based Cleaning = PLA Deformation

    • PLA softens at ~60°C (140°F), so it’s not dishwasher-safe or steam-sterilizable.

    • Autoclaves, boiling, or hot water will warp or weaken your print.


What About Specialty Cleaning Methods?

  • Isopropyl alcohol or diluted bleach may reduce surface bacteria, but again, they can’t reach deep into the layer gaps.

  • UV light (like toothbrush sanitizers) may help somewhat, but only on exposed surfaces, not inside layer lines.

  • Long-term, repeated cleaning degrades PLA, especially if it’s soft or has fine details.

The Essential Step: Surface Coating (Sealing)

Because of the porous nature of FFF prints, sealing the surface is key to making your item mouth-safe or food-friendly.

  • Use clear, food-grade epoxy resins or polyurethane coatings that are explicitly labeled food safe.

    • Popular choices: ArtResin (compliant with FDA 21 CFR 175.300), Master Bond EP42HT-2FG.

  • Avoid coatings that are only labeled “non-toxic” (like Mod Podge or Krylon sprays) unless they specifically state food safety.

  • Be cautious—liquid resins are toxic before curing. Use gloves, a respirator, and follow all curing instructions carefully.

    • An improperly cured resin can leave your item dangerously unsafe.


Limitations of Coated Prints

Even the best coating isn’t magic. Your sealed object won’t be as durable as a ceramic plate or stainless steel fork.

  • Most coatings aren’t heat-resistant—don’t microwave or wash in hot water.

  • Scratches or wear can expose the porous print beneath, allowing bacteria to creep in.

  • Not ideal for cutting boards, utensils, or anything subject to frequent abrasion.


Considering Alternatives and Safer Workarounds

If sealing sounds too risky or complicated, consider other options:

  • Use food-safe sealants like water-based poly coats (with proper research and testing).

  • Make molds using your 3D print and cast with a food-safe silicone or material instead.

  • Be cautious with SLA (resin) printing. Even when coated, most resins remain inherently toxic and prone to wear.


Final Thoughts: Is It Worth It?

Yes—but only if you do it right.

Creating food- or mouth-contact 3D prints is totally possible. But it requires more than just buying “food-safe” filament. It’s about:

  • Choosing the right material (and checking additives)

  • Keeping your printer clean and lead-free

  • Applying a properly cured, food-grade coating to seal your print

And even then—understand the limitations. These prints won’t last forever and must be treated with care. For commercial products, certification and compliance are non-negotiable.

When in doubt, always check the documentation, contact manufacturers, and—above all—prioritize safety.

I have a new role! Director of Innovation!!

I’m still here at Stony Brook University, but after many, many years with the Department of Information Technology, I have made the move to Stony Brook University Libraries!  I started off my professional life, eons ago, between working at the library and working at bookstores, and my personal hobby is to collect signed first edition SF&F books, so this is a great fit for me.  Especially since I get to continue doing what I deeply love doing, which is pairing art and technology to bring the folks of the university the COOLEST STUFF.

With the support of the Dean of Libraries, I will be spearheading bringing multiple creative and innovative spaces to the campus, at locations that are both centralized and welcoming to all students, faculty and staff.

We will certainly have 3D printers, textile-making equipment, VR/AR spaces, sensors and other electronic doodads, vinyl cutting, button making, and so much more!

I personally welcome any faculty who have ideas about alternative assessments, field trips, or entire modules they would like taught in the space to come have a consultation with me, so we can discuss what we can do together to make some dream projects for your course a reality.  We are still in the design and renovation stages of the space, and the squeaky wheel could very well get the grease when it comes to what goes into the space first.

picture showing the vinyl signage, created on a Cricut, that says Jennifer L. Adams, Director of Innovation.

Embedding Video into a Webpage or Brightspace… from Google Drive

So, as is normal, there are different ways to tackle embedding video using systems here at SBU. I’m just going to look at Google Drive embeds.  When you embed a video, it plays right there in the page; it isn’t something you have to click on that then opens a new page or downloads the video for viewing.  Like this:

(BTW – this video I am using in this demonstration was made via Copilot, as I outlined in the previous blog post.)

So how did I do that?  You usually can’t just drop a video straight into this blog site, as it will likely be too big.  So the first thing I did after making the video was drop it into my Google Drive. Make sure you share it with anyone who has the link, or with anyone at SBU, as appropriate.

The thing is, the embed code from Google Drive is a wee bit hidden.  You have to click the three dots next to the file name, open it in a new tab, then click the three dots above the player area, and at the very bottom of that menu you will find the embed code.  You will need to highlight and copy the code out from there.
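For reference, what Drive hands you is generally a small iframe pointing at the file’s /preview URL.  Here is a minimal Python sketch that assembles one from a file ID; the FILE_ID and the width/height values are placeholders, and you should prefer the exact code Drive gives you, since its attributes may differ.

—————————————————————–

# Minimal sketch: assemble a Google Drive preview iframe for a given file ID.
# FILE_ID is a placeholder; copy the real embed code from Drive when possible.
FILE_ID = "your-file-id-here"

embed_code = (
    f'<iframe src="https://drive.google.com/file/d/{FILE_ID}/preview" '
    'width="640" height="480" allow="autoplay"></iframe>'
)
print(embed_code)

—————————————————————–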

Now that you have the embed code, what next?  Well, here in the blog I switch from the Visual view to the Text view, which shows me the HTML for the page, and I just paste the code in.  It might look a little scary, but right after you paste, you can switch right back to the Visual mode and safety.

In Brightspace, you go to the area where you want the video and find this icon:

the "insert stuff" icon from the wysiwyg menu in brightspace.

This is the “insert stuff” button which can be found in just about any text entry window where you can format the text.

List of the stuff you can insert. Embed Code is at the bottom.

Enter Embed Code is at the bottom

this shows the interface where you paste the embed code.

Paste in your code and hit Next.  It gives a warning that you are getting media from another site.  Click Allow, then click Insert.  I got that warning again, and I told it to allow it again.  And now it looks like this:

screenshot showing the video embedded into a brightspace page.

Google Drive automatically captions the videos that you put in the drive. (You should still check and edit them.)

You do not get all the same engagement tools and analytics that you get with Echo360 embedding, but I felt this was still worth sharing.