AI Business Professional 온라인 연습
최종 업데이트 시간: 2026년03월09일
당신은 온라인 연습 문제를 통해 Microsoft AB-730 시험지식에 대해 자신이 어떻게 알고 있는지 파악한 후 시험 참가 신청 여부를 결정할 수 있다.
시험을 100% 합격하고 시험 준비 시간을 35% 절약하기를 바라며 AB-730 덤프 (최신 실제 시험 문제)를 사용 선택하여 현재 최신 47개의 시험 문제와 답을 포함하십시오.

정답: 
Explanation:
Prompt injection is a generative AI security risk where an attacker inserts instructions (often hidden in text, documents, webpages, or user inputs) to override or manipulate the assistant’s intended behavior. This can lead to unintended actions such as ignoring policy controls, producing unsafe outputs, or attempting to reveal sensitive information. Because generative AI systems follow natural-language instructions, they can be socially engineered to prioritize malicious content unless safeguards are in place. This is why prompt injection can cause data exposure (for example, attempting to extract confidential content from grounded sources) and can also embed harmful instructions that redirect the model’s behavior. In enterprise settings like Microsoft 365 Copilot, mitigations include grounding boundaries, permission trimming, content filtering, and instruction hierarchy (system policies over user instructions). From a business governance perspective, users should treat untrusted inputs (emails, documents, web text) as potentially hostile and apply least-privilege access and validation when using AI outputs in decision-making.
정답:
Explanation:
Microsoft 365 Copilot operates within two primary knowledge boundaries: its foundational large language model training data and the organizational data available within the Microsoft 365 tenant. However, Copilot strictly enforces Microsoft’s security and compliance model, meaning it only retrieves and uses data that the signed-in user is authorized to access.
When using the Work scope, Copilot combines the general knowledge it was trained on with organizational data such as documents, emails, chats, calendars, and files stored in Microsoft 365. Importantly, Copilot respects role-based access control and existing permissions. It does not surface information from content the user does not have access to.
Option A is incomplete because Work scope includes organizational data.
Option B is incorrect because Copilot does not access all tenant data indiscriminately; it is permission-scoped.
Option D is incomplete because Copilot also leverages its general training knowledge.
Therefore, the correct explanation is that Copilot provides responses based only on data the user can access, combined with its general training knowledge.
정답:
Explanation:
Microsoft 365 Copilot integrates with Microsoft Search to help users discover relevant content across their organization’s Microsoft 365 data estate, including SharePoint, OneDrive, Teams, and Exchange. When the objective is to determine whether similar training plans already exist, the appropriate action is to perform a search across organizational content.
Using Search allows Copilot to query indexed enterprise documents and return files, plans, or related materials that the user has permission to access. This supports content reuse, avoids duplication of work, and aligns with Microsoft’s guidance on leveraging organizational knowledge efficiently.
Designer is focused on visual content creation, Apps provides access to Microsoft 365 applications, and Pages is used for creating and organizing content within Copilot. None of these options are intended for discovering existing documents across the tenant.
Therefore, to identify similar existing training plans within your organization, the correct tool to use in Copilot is Search.

정답: 
Explanation:
Microsoft 365 Copilot Chat is included at no additional cost for users with qualifying Microsoft 365 subscriptions such as Business Basic, Business Standard, Business Premium, E3, and E5. Therefore, users without a Microsoft 365 Copilot add-on license can still access Copilot Chat.
Copilot Chat allows users to upload documents from their local device within a session for summarization, analysis, and drafting assistance. However, advanced enterprise-integrated features―such as creating agents that access organizational SharePoint folders―require a Microsoft 365 Copilot license.
Agent creation and enterprise data grounding leverage Microsoft Graph integration and are considered premium Copilot capabilities tied to licensed Copilot services.
This distinction reflects Microsoft’s licensing model: baseline AI chat access is included with qualifying subscriptions, while deep Microsoft 365 app integration and enterprise data agents require the additional Copilot license.
정답:
Explanation:
When building a custom analytics agent in Microsoft 365 Copilot that must process structured data from Excel files, advanced analytical capabilities are required. According to Microsoft AI Business Professional guidance, tasks such as performing mathematical calculations, generating aggregations, creating charts, and conducting structured data analysis require programmatic execution capabilities rather than simple text generation.
A code interpreter enables the agent to run Python-based analytical operations in a secure execution environment. This allows the agent to manipulate datasets, compute totals and averages, perform grouping and filtering, and generate visualizations such as bar charts or line graphs based on the Excel data. The interpreter bridges the gap between natural language instructions and executable analytical logic.
An image generator is designed for creative visual content and is unrelated to structured data analytics. Suggested prompts and templates improve usability and consistency but do not provide computational or visualization capabilities.
Therefore, to enable mathematical operations, aggregation, data analysis, and visualization of Excel sales data, the correct component to add to the agent is a code interpreter.

정답: 
Explanation:
Prompt scheduling in Microsoft 365 Copilot enables users to automate recurring AI-driven tasks such as generating reports, summaries, or updates. However, there are defined operational limits to ensure system performance, responsible AI usage, and governance compliance.
First, the most frequent interval at which a prompt can be scheduled is once per day. Scheduling more frequently, such as hourly, is not supported. This limitation prevents excessive automated execution that could strain system resources or generate unnecessary redundant outputs.
Second, a scheduled prompt has a maximum execution limit of 15 times. After reaching this threshold, the schedule must be recreated if further automation is required. This control ensures users periodically review and validate automated prompts for relevance, accuracy, and business alignment.
These constraints reflect Microsoft’s enterprise AI governance strategy: balancing productivity automation with performance optimization, cost control, compliance oversight, and responsible AI lifecycle management within Microsoft 365 Copilot environments.

정답: 
Explanation:
The correct answer is instructions because an agent’s persona―such as being warm, friendly, formal, or analytical―is defined through its behavioral and communication guidance. In Microsoft 365 Copilot, the Instructions field controls how the agent responds, including tone, style, personality, response structure, and interaction patterns.
The description typically explains what the agent does, not how it communicates. Capabilities define what tasks the agent can perform, such as summarizing documents or analyzing data. Suggested prompts provide example user queries but do not control the agent’s personality or communication style.
To establish a warm and friendly persona, the instructions should explicitly state guidance such as: “Respond in a supportive, conversational tone. Use clear and positive language. Be encouraging and approachable.” This ensures consistent tone across all interactions.
Defining persona through instructions reflects prompt engineering best practices in generative AI, enabling predictable behavior, improved user experience, and alignment with organizational communication standards within Microsoft 365 Copilot environments.

정답: 
Explanation:
When creating an agent in the Microsoft 365 Copilot app using a template, administrators or authorized users can configure specific behaviors and knowledge sources for the agent. Therefore, it is correct that you can choose which knowledge sources the agent can use. This ensures proper grounding in approved Microsoft 365 data such as SharePoint sites, OneDrive folders, or other organizational repositories. Controlling knowledge sources supports responsible AI principles, including data security, relevance, and compliance.
Agents created within Microsoft 365 are designed primarily for internal organizational use. As a result, sharing the agent with users inside the organization is supported, enabling collaboration and productivity improvements across teams.
However, sharing the agent with external users is not supported by default. External sharing would introduce security and compliance risks, so Microsoft enforces tenant-level boundaries to protect enterprise data.
These capabilities reflect best practices in managing AI prompts and agents securely within enterprise environments.
정답:
Explanation:
Microsoft provides centralized activity management controls through the My Account portal, which allows users to manage privacy settings, activity history, and data associated with Microsoft 365 services, including Copilot. When the requirement is to delete all conversations with minimal effort, the most efficient method is to use the account-level activity management tools rather than deleting conversations individually.
The My Account portal enables bulk management of Copilot activity data, allowing users to clear conversation history in a consolidated manner. This approach aligns with Microsoft’s privacy-by-design framework, giving users control over their AI-generated interaction history without requiring administrative intervention.
Using the Copilot web or desktop app would typically require manually deleting conversations one at a time, increasing effort. The Windows 11 Settings app is unrelated to Microsoft 365 Copilot data management.
Therefore, to delete all Copilot conversations efficiently and with the least amount of effort, the correct approach is to use the My Account portal in Microsoft 365.
정답:
Explanation:
Microsoft 365 Copilot operates within the security, compliance, and identity boundaries of a Microsoft 365 tenant. Shared prompts, prompt links, and Copilot artifacts are governed by organizational access controls and tenant isolation. If a prompt is created and shared from outside your organization, cross-tenant access may not be supported depending on the sharing configuration and administrative policies.
When a user attempts to open a prompt that resides in another organization’s tenant without proper cross-tenant sharing permissions, Copilot cannot locate or validate the resource within the user’s own environment. As a result, the system displays a “Prompt not found” message.
Option B would typically result in an access or permissions error rather than the prompt being unavailable entirely. Sensitivity labels and scheduled prompts do not inherently cause a “not found” error. Therefore, the most likely cause is that the prompt exists outside your organization’s tenant boundary and is not accessible to you.
정답:
Explanation:
When building a custom agent in Microsoft 365 Copilot for marketing use cases, the required capabilities must align with the intended outputs. Marketing collateral includes written copy as well as visual assets such as logos and artwork. According to Microsoft AI Business Professional documentation, generative AI solutions that produce visual creative assets require integrated image generation capabilities powered by multimodal AI models.
Option C is correct because an image generator enables the agent to create visual marketing materials such as logos, artwork, and design elements. While templates and suggested prompts can improve usability and consistency, they do not provide the underlying capability to generate images. A code interpreter is designed for data analysis, calculations, or technical scripting tasks and is not relevant to creative marketing asset production.
Therefore, to fulfill the requirement of producing both textual and visual marketing collateral, the most essential addition to the custom agent is an image generator.
정답:
Explanation:
Microsoft 365 Copilot notebooks use the version of a file that was uploaded or attached at the time it was added to the notebook. When a document such as Process.docx is added from a local folder, Copilot references that uploaded snapshot of the file. If the file is later modified locally, the notebook does not automatically sync or refresh with the updated local version unless the updated file is re-uploaded.
Microsoft guidance on grounding and file references explains that Copilot works with the specific content stored in Microsoft 365 or the attached artifact within the notebook session. Since the updated version remains in the local folder and was not reattached, Copilot continues to use the originally added version.
Therefore, during subsequent chats in the notebook, Copilot references only the original uploaded version of Process.docx.

정답: 
Explanation:
The first statement is correct because Microsoft 365 provides transparency and user control through the My Account portal, where users can review their Copilot activity history. This aligns with Microsoft’s responsible AI and data governance principles, ensuring visibility into AI interactions.
The second statement is incorrect. Deleting Copilot activity history removes stored interaction records (such as prompts and responses), but it does not automatically delete associated notebooks, documents, or pages stored in services like OneDrive or SharePoint. Those files remain governed by standard Microsoft 365 retention and lifecycle policies.
The third statement is correct because users can delete their entire Copilot activity history, including both prompts and generated responses. This reinforces enterprise-grade privacy controls and regulatory compliance requirements.
These controls demonstrate core generative AI fundamentals: transparency, user data ownership, security boundaries, and responsible lifecycle management of AI-generated interactions within Microsoft 365 environments.

정답: 
Explanation:
You can continue the conversation in Copilot in Word. Answer. Yes
You can continue the conversation in Copilot for Teams. Answer. Yes
You can continue the conversation in Copilot in Outlook. Answer. Yes
Microsoft 365 Copilot provides cross-application continuity across supported Microsoft 365 apps. When using the Researcher agent, users can continue or extend their work within applications such as Word, Teams, and Outlook.
In Word, users can incorporate research results directly into drafted documents, enabling seamless transition from research to content creation. In Teams, Copilot supports contextual collaboration by allowing insights from research to inform meetings or chats. In Outlook, findings can be referenced to compose informed and data-driven communications.
This interoperability is enabled by Microsoft Graph integration, which allows Copilot to securely access and contextualize enterprise data across applications while respecting organizational permissions.
These capabilities demonstrate effective management of prompts and conversations by maintaining workflow continuity, enhancing productivity, and supporting enterprise-grade collaboration powered by generative AI.

정답: 
Explanation:
Temporary chat in Microsoft 365 Copilot is designed for privacy-focused, non-persistent interactions. When a user initiates a temporary chat session, the conversation is not saved to Copilot chat history and is not retained for future reference within the standard conversation history interface.
Once the session ends or a new chat begins, the prompts and responses from that temporary session are deleted immediately. They are not moved to OneDrive, stored in a notebook, or retained for a delayed deletion period.
This feature supports secure and confidential usage scenarios where users may want to explore ideas, draft sensitive content, or test prompts without creating a stored record in their activity history.
From a generative AI governance perspective, this reflects Microsoft’s design principle of providing configurable persistence levels, balancing productivity with enterprise security, compliance, and user privacy controls.