How can you get your internal files ready for AI tools to find them accurately at your nonprofit? Are the permissions set correctly to protect personally identifiable information? Can you use AI to help find files with “open” permissions?
Like podcasts? Find our full archive here or anywhere you listen to podcasts: search Community IT Innovators Nonprofit Technology Topics on Apple, Spotify, Google, Stitcher, Pandora, and more. Or ask your smart speaker.
Is your file storage architecture ready for AI search?
Carolyn sits down with Steve Longenecker, Director of IT Consulting at Community IT Innovators, to tackle a question that’s suddenly urgent for many nonprofits: now that AI tools like Microsoft Copilot and Google Gemini can search your entire file system, are your permissions actually set up correctly? How can you prep your file permissions to use AI tools securely?
The conversation covers the practical steps nonprofits can take to assess and clean up their SharePoint and Google Workspace permissions before — or after — turning on AI. Steve and Carolyn discuss:
- Why AI tools like Copilot only surface files users are already permitted to see — and why that’s not as reassuring as it sounds.
- The “security through obscurity” problem: how files that were harmlessly buried for years can suddenly become visible to anyone.
- How Microsoft tracks “anyone at my organization” share links — and why you should change your default sharing settings now.
- What Restricted SharePoint Search is, and how it can help you safely roll out Copilot site by site.
- Practical first steps for nonprofits with messy, organic SharePoint environments.
As Steve puts it, old SharePoint architecture represents technical debt that’s going to have to get paid down eventually — and AI may be making that day come sooner.
Resources Mentioned:
- Community IT’s Microsoft Tools Resource Library for Nonprofits: https://communityit.com/microsoft-tools-for-nonprofits/
- Microsoft Restricted SharePoint Search — overview for organizations rolling out Copilot: https://techcommunity.microsoft.com/blog/microsoft365copilotblog/introducing-restricted-sharepoint-search-to-help-you-get-started-with-copilot-fo/4071060
- SharePoint permissions governance — a conceptual overview for site owners and leadership: https://support.microsoft.com/en-us/office/overview-site-governance-permission-and-sharing-for-site-owners-95e83c3d-e1b0-4aae-9d08-e94dcaa4942e
Presenters

As Director of IT Consulting, Steve Longenecker divides his time at Community IT primarily between managing the company’s Projects Team and consulting with clients on IT planning. Steve brings a deep background in IT support and strategic IT management experience to his work with clients. His thoughtful and empathetic demeanor helps non-technical nonprofit leaders manage their IT projects and understand the Community IT partnership approach.
Steve also specializes in Information Architecture and migrations, implementations, file-sharing platforms, collaboration tools, and Google Workspace support. His knowledge of nonprofit budgeting and management styles makes him an invaluable partner in technology projects.
Steve is MCSE certified. He has a B.A. in Biology from Earlham College in Richmond, IN and a Masters in the Art of Teaching from Tufts University in Massachusetts. He is a regular contributor to the Tech Topics podcast at Community IT, drawing on his decades of experience interacting with clients on complex projects and IT management challenges. This conversation on how to prep your file permissions for AI tools is on many clients’ minds and he was happy to share his experiences.

Carolyn Woodard is currently head of Marketing and Outreach at Community IT Innovators. She has served many roles at Community IT, from client to project manager to marketing. With over twenty years of experience in the nonprofit world, including as a nonprofit technology project manager and Director of IT at both large and small organizations, Carolyn knows the frustrations and delights of working with technology professionals, accidental techies, executives, and staff to deliver your organization’s mission and keep your IT infrastructure operating. She has a master’s degree in Nonprofit Management from Johns Hopkins University and received her undergraduate degree in English Literature from Williams College.
She was happy to have this podcast conversation with Steve about how to prep your file permissions for AI tools being adopted at your nonprofit.
Ready to get strategic about your IT?
Community IT has been serving nonprofits exclusively for twenty-five years. In fact, we celebrate 25 years of Community IT this month and all year. We offer Managed IT support services for nonprofits that want to outsource all or part of their IT support and hosted services. For a fixed monthly fee, we provide unlimited remote and on-site help desk support, proactive network management, and ongoing IT planning from a dedicated team of experts in nonprofit-focused IT. And our clients benefit from our IT Business Managers team who will work with you to plan your IT investments and technology roadmap if you don’t have an in-house IT Director.
Being 100% employee-owned is important to us and our clients. It is an important aspect of our culture as a business serving nonprofits exclusively for 25 years. Unlike most MSPs, Community IT considers budgeting and strategic management a major part of our services to our clients.
We constantly research and evaluate new technology to ensure that you get cutting-edge solutions that are tailored to your organization, using standard industry tech tools that don’t lock you into a single vendor or consultant. And we don’t treat any aspect of nonprofit IT as if it is too complicated for you to understand.
We think your IT vendor should be able to explain everything without jargon or lingo. If you can’t understand your IT management strategy to your own satisfaction, keep asking your questions until you find an outsourced IT provider who will partner with you for well-managed IT.
More on our Managed Services here. More resources on Cybersecurity here.
If you’re ready to gain peace of mind about your IT support, let’s talk.
Transcript
Carolyn Woodard: We were going to talk about cleaning up data permissions.
Steve Longenecker: Yeah.
Carolyn Woodard: How to start doing it.
Welcome everyone to the Community IT Innovators Technology Topics podcast. I’m Carolyn Woodard, your host, and I’m happy today to be joined by my colleague, Steve Longenecker, who is our Director of IT Consulting and a longtime employee-owner at Community IT. And we’re going to discuss a little bit more about how to clean up your permissions if you are worried that AI is going to surface some files or information that not everyone at your organization should have access to.
Steve Longenecker: In general, our recommendation right now to our clients is that the value of having your AI tool integrated with your main productivity platform is so strong that that’s probably the place to start. So if you’re a Microsoft 365 customer for email and files, you might as well just use Copilot, even if it’s not the strongest AI in the world. Although I think a lot of people underestimate Copilot, or at least I have. I think it’s not as bad as people might have said or experienced in the past. Partly, you can pick the specific model that Copilot’s using more easily than I thought. So that’s one thing.
Similarly, though, just to close that loop, if you’re a Google customer using Gemini, it makes a lot of sense too. So then when you ask Gemini a question, similar to asking Copilot a question, it can draw on the data that’s in your Google Workspace or in your Microsoft 365 tenant, respectively. And if that’s where your data is, that makes for a more powerful AI experience if you’re asking it questions pertaining to your work data.
Now, if you’re asking a general question like you’re trying to vibe code something, or you just want to know a summary of a bunch of web pages, then you can use any model and it doesn’t matter, your work data is not pertinent to that. But if you’re asking a question about “find me the files that we have as an organization about this customer or about this program” or “summarize them for me,” then already having access to those files or to your emails—that’s really a powerful thing about Copilot for me is that it’ll check my work against what I’ve been emailing with people, and that’s kind of cool.
But yeah, then the risk that this conversation starts with is: all right, you’ve hooked it up, you’re using an AI agent that can see your company’s files. Well, aren’t we worried that it’s going to surface information that you actually shouldn’t have access to? And both Gemini with Google and Copilot with Microsoft promise that they will only surface data that the user of the AI is allowed to see. So if you ask Copilot a question about budgets, it’s not going to show you budgets that list people’s salaries if only the CFO and the CEO or certain departments are allowed to see them. If you’re not supposed to see those salaries, Copilot won’t show them to you.
The risk and concern is that organizations in the past may have been sloppy about those permissions because Copilot or Gemini wasn’t in the picture. And so no one even knew that this library with all this budget information existed, besides the members of the Microsoft Teams team that the CFO spun up and invited the accountant and the CEO to join. All right, they are the only ones that even know this team exists. No one else is really aware of it — not that it’s like a state secret, but it’s not advertised, it’s not in any indexes.
But when they created the team, it was not really a thought-out thing. They just clicked on the public team instead of the private team. And that immediately sets up certain permissions. And if no one knows the team exists and no one is looking for it, then it doesn’t matter. But now that Copilot is on the scene, it does matter, because Copilot views that team as public. You said it was. It may be artificially intelligent, but it’s not intelligent enough to second-guess your motives; it’s just honoring the permissions that you gave.
So almost any AI implementation plan will start with that kind of readiness question. Do you feel confident that your file libraries are secured appropriately? The AI is going to honor the permissions, but are the permissions correct?
One thing that was really interesting to me that I learned recently — and I want to give a shout-out to Dan Shank-Evans, the chief information officer at the Carnegie Endowment for International Peace, which is a Community IT client. I did ask him if I could give him this shout-out, because I wouldn’t do that otherwise, and he said that’s fine. I had told him that if you share a file with someone — like if I share a file with you — and I use what SharePoint calls the “anyone at Community IT” link, that’s one of the link types you can use to share a file. And the idea is that this link will work for anyone at Community IT.
Carolyn Woodard: Yeah. But you’ve just shared it in Teams, or you sent it in an email.
Steve Longenecker: Yeah, I sent it as an email to you, or I sent it as a private chat to you. So you’re the only one with this link. And this link is extremely long and random. The URL starts with some familiar names and stuff, but then it’s just a bunch of alphanumeric characters. And so I had told Dan, well, that link says that anyone at Community IT can have access to this file. And so Copilot’s going to find that file if someone else, besides you, asks a question pertinent to that file.
If that’s your default sharing link, you just use it because it’s like, yeah, I want Carolyn to be able to see this. She’s in Community IT, so the link’s going to work great for her. It’s the simplest thing, no extra clicks, just grab the link and send it to her. And I know she’s not going to forward this link to anybody else because this is private. Done and done.
Well, I told Dan that’s a problem. So what we need to do is find all those share links and weed them out. Which, by the way, you can do in PowerShell. You can get a listing of all the share links of that particular type, and then you can figure out which ones should actually be changed to specific-people links, and track them back. Or maybe unshare them altogether because it’s no longer relevant to share them anyway. It’s like a budget from three years ago that we’re not actively collaborating on. We’re not going to delete the file, but we can remove the sharing.
Carolyn Woodard: Like make a policy: anything over four years old. Maybe something like that.
Steve Longenecker: There’s different ways that you could do that. So, anyhow, Dan was really concerned about this, and he did some experimenting. What I promised him would happen wasn’t happening. So he did some more research and he found out that actually Copilot—and I can’t speak to how Gemini would work with this—but with Copilot at least, it actually keeps track of who the link has been passed to.
So if I have created an “anyone at Community IT” link to a file and I’ve shared it with you, Copilot knows—it can see the email with the link in it that is in your mailbox. So it knows that that link works for you.
Carolyn Woodard: But if you haven’t actually forwarded it, and “Joe Random” asks for it, it wasn’t shared with Joe Random.
Steve Longenecker: That’s right. If you forward the link to someone else, then Copilot will find it for them too. That just seems amazing. Not so much the idea itself, but the level of having to enumerate all this information. But that’s what artificial intelligence is good at. It can look at a million things at once very quickly and find the pattern. So that’s one of the patterns it looks at.
So it’s not as dismal a situation as I was promising to Dan. Now, I pointed out to him, hey, but we’ve all been in the situation where a thread that’s 20 emails long has now turned a corner and gone to a different issue. And someone thinks, “Oh, now that the thread is asking this question, I should CC so-and-so.” “Let me CC our colleague, Henry.” Now Henry gets this long thread. And if Henry bothers to scroll through the whole email, he realizes that this conversation has taken a number of meanders. And down there at the bottom is some information that he shouldn’t see, but he can.
So it’s still a risk, and you’re still advised to maybe start new threads sometimes when you do that, or pay attention to that. And just in general, avoid the “anyone at my organization” link unless you actually mean that and use specific people links. I would say change the default on your sharing links. The default is often “anyone” or “anyone at my organization” and make the default “people with existing access” so that they have to actually change it to something if they mean to share it more widely than that. That’s what we go with now for the default sharing link.
But back to your question, yes, you can—again in Microsoft I’m more familiar with—you can run PowerShell commands to get lists of all of these sharing links and go through and weed them out. You can look at the different libraries and see if folders have sharing that they shouldn’t, or if whole libraries have sharing.
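The PowerShell approach Steve describes can be sketched roughly as follows. This is only an illustration, not a verified script: it assumes the PnP.PowerShell module and its Get-PnPFileSharingLink cmdlet, plus the SharePoint Online Management Shell for the tenant default, and all site URLs and library names are placeholders. Cmdlet parameters can vary by module version, so check the current documentation before running anything.

```powershell
# Illustrative sketch only -- assumes PnP.PowerShell; URLs and names are placeholders.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/Finance" -Interactive

# Walk the Documents library and report every "anyone at my organization" link
Get-PnPListItem -List "Documents" -PageSize 500 | Where-Object {
    $_.FileSystemObjectType -eq "File"
} | ForEach-Object {
    $fileUrl = $_["FileRef"]
    foreach ($link in (Get-PnPFileSharingLink -Identity $fileUrl)) {
        # "organization" scope corresponds to the org-wide links to weed out
        if ($link.Link.Scope -eq "organization") {
            [PSCustomObject]@{ File = $fileUrl; LinkUrl = $link.Link.WebUrl }
        }
    }
}

# Separately, in the SharePoint Online Management Shell, you can move the
# tenant's default link type away from "anyone at my organization"
# ("Internal"); "Direct" means "specific people".
# Connect-SPOService -Url "https://contoso-admin.sharepoint.com"
# Set-SPOTenant -DefaultSharingLinkType Direct
```

Note that the tenant-level defaults exposed here are coarser than the per-file sharing menu; making “people with existing access” the default, as Steve recommends, may need to be configured through the sharing settings in the SharePoint admin center.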
One thing that Microsoft does that I think is good in this respect — and this is not something that I see turned on much at all, but it can be turned on — is what’s called Restricted SharePoint Search. It’s a global admin setting, so it applies to the entire tenant. The idea is that when you turn it on, Copilot can’t see any SharePoint sites, which of course defeats the purpose, since the whole point is that you want Copilot to be able to see your sites. But then you can add to the allowed list the sites that you have vetted and are comfortable turning on.
So it turns off everything, and then you turn things on site by site. You can validate the governance for a site and then add it to the allowed list, and at some point you’ve done that for all of your sites, and at that point you can turn the restricted search off because you’ve validated everything.
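As a rough sketch of what that rollout can look like, assuming the Microsoft.Online.SharePoint.PowerShell module (recent versions include the Restricted SharePoint Search cmdlets) and SharePoint admin rights; the URLs are placeholders:

```powershell
# Sketch only -- verify cmdlet availability in your module version first.
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Turn on Restricted SharePoint Search tenant-wide: Copilot and org-wide
# search now see only allow-listed sites.
Set-SPOTenantRestrictedSearchMode -Mode Enabled

# Add vetted sites to the allowed list (Microsoft caps the list size,
# 100 sites when the feature launched)
Add-SPOTenantRestrictedSearchAllowedList -SitesList @(
    "https://contoso.sharepoint.com/sites/Finance",
    "https://contoso.sharepoint.com/sites/Programs"
)

# Review what is currently allowed
Get-SPOTenantRestrictedSearchAllowedList

# Once every site is vetted, turn the restriction back off:
# Set-SPOTenantRestrictedSearchMode -Mode Disabled
```

Because the allowed list is capped, Microsoft positions this as a temporary on-ramp while you vet sites, not a permanent governance model.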
So that’s probably something I would use if I were concerned — and I say “if” because there are clients that have set up their SharePoint infrastructure with great care and were diligent about permissions and security well before Copilot. As long as Copilot respects permissions — which it does — they can turn on Copilot for their end users with confidence. But we definitely have clients that are not in that position and should probably use these tools to manage the transition. And I think Gemini has similar tools; I’m just not as familiar with it.
Carolyn Woodard: I feel like this conversation speaks a little bit to training, maybe onboarding, and revisiting your training of: how do we share files with each other, and where do we store them? If you join an organization and you are on the program team, and then you have a SharePoint folder that’s your folder for your team — a lot of those things, in my experience, are just kind of left to chance. Maybe somebody else in your department will tell you how it works.
Maybe that’s something in going along with looking at these permissions is to revitalize or create an onboarding reference that everyone can look at of: here’s how we share. I get confused in Teams because you share a document with a chat that you’re on in Teams with your colleagues, and they have access to it. And then I don’t know, it gets confusing, like, well, where do you put that file? And then sharing the file is like, oh, you already shared that with somebody else. If you don’t have a philosophy and a policy of “this is how we use Teams, this is how we use SharePoint, this is how we want you to share and store your stuff you’re working on, this is how we want you to archive it when it’s done,” then it’s really hard to know. If you have a staff of even just 15 people, they’re probably sharing things 15 different ways.
Steve Longenecker: All of that is true and has been true for a long time. I think that the main shift, the imperative, is that before we had AI’s ability to find needles in haystacks very quickly with natural language requests, security through obscurity was a real thing. And now it’s just not.
If there’s some folder buried deep in a file structure that has personal information—PII is an example.
Carolyn Woodard: So like Social Security numbers of all of your colleagues.
Steve Longenecker: It used to be that we did it that way. We organized things that way, and we didn’t worry about it because it was fine, and now the files are still there. They’re still in that old library, and it might not even be a library that anyone’s gone to. Maybe it used to be on a file server and we migrated it to SharePoint, and it got ported over, and it’s just sort of there, and no one knows about it, and it didn’t matter, but now it does. And there are other things that both Microsoft and Google can do, like looking for PII, looking for Social Security numbers: where are these things, and are they somewhere that they shouldn’t be? Every organization will have files like this somewhere. It may not be in their SharePoint, it might be in their HR system, but resumes are going to be saved somewhere. Are they saved somewhere secure? That’s the question.
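On the Microsoft side, the “looking for PII” capability Steve alludes to is data loss prevention in Microsoft Purview. A rough sketch of a policy that hunts for Social Security numbers across SharePoint and OneDrive, assuming the ExchangeOnlineManagement module’s Connect-IPPSSession and Purview DLP licensing; the policy and rule names here are made up for illustration:

```powershell
# Sketch only -- assumes Purview DLP licensing; names are placeholders.
Connect-IPPSSession

# Policy scoped to all SharePoint and OneDrive locations, in test mode
# so it reports findings without notifying users
New-DlpCompliancePolicy -Name "Find stray SSNs" `
    -SharePointLocation All -OneDriveLocation All `
    -Mode TestWithoutNotifications

# Rule that flags content containing U.S. Social Security numbers and
# sends an incident report to the site admin
New-DlpComplianceRule -Name "SSN detected" -Policy "Find stray SSNs" `
    -ContentContainsSensitiveInformation @{ Name = "U.S. Social Security Number (SSN)" } `
    -GenerateIncidentReport SiteAdmin
```

Running in test mode first gives you an inventory of where PII is sitting before you decide whether to block, move, or re-permission those files.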
Carolyn Woodard: So I think I have a last question for you on this, and it’s kind of a hypothetical, but I think it might apply to some nonprofits. If you are at a nonprofit and your SharePoint is a mess — it’s just grown organically, willy-nilly, no one explained it to you. And maybe you don’t have an MSP or there’s no IT person; IT is under your CFO, for example. And you’re on that leadership team and you’re worried that this permissions issue will come up. You have staff who are looking with Copilot to find information on a program from five years ago because it could help them today. What would be your advice for a first one or two steps that executive team should take to start to deal with this?
Steve Longenecker: A lot of the steps I would advise do require a certain amount of technical ability. I would want to look at the structure of the SharePoint infrastructure. The model that is recommended now — which is not necessarily the model that SharePoint sites built 10 years ago followed — is that you have a different site for each permission group. So the finance team would have a finance site with the finance library, and the finance site has a specific membership that is very visible.
Sites that were built 10 years ago might just have one site for the whole company, and then there’s just a finance folder where the permissions are different for that subfolder than they are for everything else. Those can be very difficult to audit. It’s not difficult to find out what the permissions of a specific folder are, but checking each specific folder takes a series of clicks, and it doesn’t scale very well, because that’s just one folder. Now you have this other folder and this other folder.
So how do you find that out? It might actually in some ways be wise to make that whole site invisible to Copilot and then, little by little, move the stuff out to other places that are secured, and then make those places visible one by one. It depends on how woolly your SharePoint infrastructure is, but your hypothetical situation was a really rangy, out-of-control, organic SharePoint. It probably does make sense to get your arms around that before Copilot can start. And fortunately, there are ways in SharePoint to say, “hey, we want to make this site unavailable to Copilot” without doing anything else. That’s just a simple step. And then you can start to port the stuff out and clean things up and start making things available.
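The per-site “make this site unavailable to Copilot” switch Steve mentions maps, as far as I can tell, to what Microsoft calls Restricted Content Discovery. A sketch, assuming the SharePoint Online Management Shell, and noting that this capability may require SharePoint Advanced Management licensing; the site URL is a placeholder:

```powershell
# Sketch only -- verify licensing and parameter availability in your tenant.
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Hide one messy site's content from Copilot and org-wide search while
# you clean it up; other sites are unaffected, and users with direct
# access can still open files the normal way.
Set-SPOSite -Identity "https://contoso.sharepoint.com/sites/LegacyFiles" `
    -RestrictContentOrgWideSearch $true
```

Unlike the tenant-wide Restricted SharePoint Search setting, this works site by site, which fits the “quarantine the mess, then migrate out of it” approach described above.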
Carolyn Woodard: Well, that makes sense. And maybe this is your wake-up call. I mean, I guess with a 12-step program, the first step is to know you have a problem.
Steve Longenecker: Right. The other thing I would do in that situation is deputize someone — you can’t really do this yourself as leadership, because you have access to all this stuff — but deputize an intern to hammer away at Copilot, see what they can find, and then bring that material to you. A young intern especially might be good at asking Copilot to find stuff. It’s kind of like the old “red team” approach to security. You need a team that’s trying to get through your stuff — they’re on your side, but you’ve deputized them to probe your defenses. That would be another interesting tack to take.
And that’s kind of what Dan was doing. He was like, “Yeah, Steve told me this, it makes sense, but let me test it.” And it wasn’t working the way I said. And so that’s when he did a little more research and found out that actually Copilot tracks where the link has gone. Have someone who shouldn’t be able to find stuff and see if they can and see what comes up. It may be that your situation is not as bad as it seems because maybe the permissions are working.
Carolyn Woodard: And usually there is someone who knows something about SharePoint. So maybe find the person on your staff who knows the most about it and ask them to help you understand where the risks might lie. Or maybe, as a leadership team, you talk about it and you’re like, “we need a consultant who’s going to just come in and help.”
Steve Longenecker: I do think that those sites that are built on the old paradigms — and we built some sites with those old paradigms, it was just a different time — but that idea of having a single company site with folders that have different permissions, that does represent a technical debt that’s going to have to get paid down sometime. Now, it may not need to get paid down today. It may not be your most urgent priority if it’s basically working for you. Copilot might be raising the stakes a little bit or making it more of an imperative. But even if the permissions seem to be holding when you have your intern test them out, it still represents technical debt, because you don’t have visibility and it’s hard to manage. At some point, you’re going to need to unwind that and move things out of that old architecture into the more modern architecture. SharePoint’s a great platform, but that original architecture didn’t hold up all that well, and there’s a reason we don’t do it that way anymore.
Carolyn Woodard: I heard somebody say recently that data is power. Our nonprofits have a lot more valuable data than they might think they do.
Steve Longenecker: Data is risk. Risk and power.
Carolyn Woodard: AI is kind of forcing us to have those conversations. So thank you, Steve, so much for joining me today and explaining a little bit more about this and giving us some steps to think about.
Steve Longenecker: I hope it was helpful. Thank you, Carolyn.
As advocates for using technology transparently to work smarter, we’re practicing what we recommend. This transcript was edited lightly with the assistance of AI for clarity, and is not a verbatim transcript. The content was reviewed, edited, and finalized by a human editor to ensure accuracy and relevance.
Photo by Maarten van den Heuvel on Unsplash