AI Usage Policy

Your Ghost Production does not allow fully AI-generated tracks, AI-generated music parts, or AI-generated stems.

The only AI-related exception currently allowed is AI vocals, and only under strict conditions, with disclosure and platform review. If AI is used, the producer must disclose that usage and provide the AI service name where required. AI-cloned vocals of real artists are not allowed. Udio vocals are disallowed under the current policy. Producers must also confirm during the submission process that restricted AI vocal services were not used.

This policy exists because Your Ghost Production is a marketplace for ready-to-release tracks, not a dumping ground for prompt-generated music. Buyers need to know that tracks listed on the platform follow clear rules around originality, vocal sources, AI disclosure, and rights-sensitive material. Producers also need clear boundaries before submitting music for moderation.

The short version is simple:

Fully AI-generated tracks are not allowed.

AI-generated music parts are not allowed.

AI-generated stems are not allowed.

AI-cloned vocals of real artists are not allowed.

Udio vocals are not allowed.

Compliant AI vocals may be allowed only if properly disclosed and accepted under the platform rules.

That is the standard producers must follow and buyers should understand.

Why Your Ghost Production has an AI policy

AI has made music rights, originality, and buyer trust more complicated.

A track can now be generated from a prompt. A vocal can be created without a singer. A voice can be made to sound like a real artist. A stem can be created by an AI system. A melody, chord progression, or instrumental section can be generated by software and then edited by a human afterward.

For a casual listener, some of this may not matter. For a ghost production marketplace, it matters a lot.

Buyers are not only listening to music for entertainment. They may purchase a track, release it under their own artist name, distribute it commercially, pitch it to labels, use it in DJ sets, build content around it, or include it in a larger artist strategy. They need to understand what they are buying.

Producers also need a clear rule set. Without a policy, one producer might submit a fully AI-generated track, another might use AI-generated stems, another might use an AI clone of a famous singer, and another might use a compliant AI vocal tool but disclose nothing. That would create a messy and unsafe catalog.

YGP’s AI policy protects the platform by drawing a clear line between accepted production work and restricted AI-generated content.

Fully AI-generated tracks are not allowed

A fully AI-generated track is not allowed on Your Ghost Production.

This means a producer cannot generate a complete song or instrumental through an AI system and submit it as a normal ghost production track. The track cannot be primarily created by AI and then sold as if it were a human-produced, ready-to-release production.

This is one of the most important rules in the policy.

A fully AI-generated track can create too many unanswered questions. What rights does the producer actually control? What does the AI service allow? Could another user create a similar output? Was copyrighted training material involved? Will the buyer’s distributor accept the track? Will a label reject it? Can the buyer confidently release it under their artist identity?

Those questions can become serious after purchase.

A buyer should not discover after checkout that the music was generated by an AI system. A producer should not place that risk on the buyer or the platform. If the track is fully AI-generated, it should not be submitted to YGP.

AI-generated music parts are not allowed

YGP also does not allow AI-generated music parts.

This rule is important because a track does not need to be fully AI-generated to create problems. A producer might generate only the drop, a melody, a bassline, a chord progression, a backing instrumental, a synth section, or a musical hook through AI, then build around it. Under YGP’s current policy, AI-generated music parts are not allowed.

That means the restriction applies to more than the final master.

A producer cannot use AI to generate musical content and then present that music as a normal submitted production. Editing the AI-generated part afterward does not automatically make it acceptable. Re-processing, re-arranging, layering, or mixing an AI-generated musical idea does not remove the fact that the part came from AI.

This protects the marketplace from partial AI productions being sold as human-created ghost productions.

For producers, the rule is direct: if AI generated the musical part, do not use it in a submitted YGP track.

AI-generated stems are not allowed

AI-generated stems are not allowed.

Stems are important in ghost production because buyers often use them for edits, live versions, alternate mixes, vocal adjustments, arrangement changes, and release preparation. If the stems were generated by AI, the buyer may inherit AI-related uncertainty inside the delivered package.

YGP’s policy specifically bans AI-generated stems.

That means producers should not generate stem material through AI and include it as part of a track submission. This applies whether the stem is a vocal-like layer, instrumental bed, drum section, melody, texture, backing part, or other musical element generated by an AI tool.

If a buyer receives stems, they should be able to treat them as part of the delivered production package under the track’s purchase terms. AI-generated stems would weaken that trust.

AI vocals may be allowed, but only under strict conditions

AI vocals are the only AI-related exception currently allowed by YGP, and that exception is narrow.

AI vocals may be accepted only if they are compliant, properly disclosed, and do not violate the platform’s restrictions. AI usage disclosure is required. If AI is used, the AI service name is required. The submission process also requires confirmation that restricted AI vocal services were not used.

This means producers cannot treat AI vocals as a loophole.

An AI vocal is not automatically acceptable just because the instrumental was human-produced. The vocal still needs to fit the policy. The producer must disclose it, identify the service where required, avoid restricted tools, and avoid any real-artist cloning or impersonation.

For buyers, this distinction matters. A track with compliant disclosed AI vocals may still be allowed on YGP, but it is not the same as a track with original human vocals. Buyers should check the vocal information before purchase and decide whether that type of vocal fits their release plan.

AI-cloned vocals of real artists are not allowed

AI-cloned vocals of real artists are not allowed on YGP.

This rule is non-negotiable.

A producer cannot use AI to imitate a known singer, celebrity, artist, rapper, vocalist, public figure, or recognizable real person. It does not matter whether the result sounds impressive. It does not matter whether the voice is slightly edited. It does not matter whether the producer thinks the track will sell faster because the voice sounds familiar.

Real-artist voice cloning creates serious risk.

It can mislead buyers. It can create publicity and likeness issues. It can cause takedowns or distributor problems. It can damage the buyer’s artist brand. It can create legal conflict. It can make the marketplace look unsafe.

A buyer should be especially cautious with any vocal that sounds like a famous artist. If something feels suspicious, the buyer should contact support before purchasing or releasing the track.

Vocal impersonation is not allowed

YGP’s vocal rules also prohibit vocal impersonation and voice-cloning of real artists. All rights and permissions must be in place before submission.

This matters because AI voice issues are not limited to exact technical cloning.

A producer may try to create a vocal that strongly imitates a real artist without saying it is a clone. That is still a problem if the result is designed to sound like a recognizable real person. A marketplace cannot rely only on the technical label used by the producer. The practical effect matters too.

If a vocal is built to make buyers think of a specific real artist, that is not acceptable.

YGP’s policy is designed to prevent tracks from being sold on the strength of someone else’s recognizable voice or identity.

Udio vocals are disallowed

Udio vocals are disallowed under YGP’s current policy.

This is a specific rule producers must follow. Even though compliant AI vocals may be allowed in some cases, Udio vocals are not allowed as part of that exception.

A producer should not submit a track that uses Udio vocals. A producer also should not hide the service name or describe the vocal vaguely to avoid review. If AI was used, the AI service name is required.

For buyers, this is useful to know because it shows that YGP’s AI vocal exception is not open-ended. The platform does not simply allow any AI vocal from any tool. The AI vocal must fit the allowed path.

AI usage disclosure is required

If AI is used, disclosure is required.

YGP’s submission flow requires AI usage disclosure. If AI was used, the AI service name is required.

This is one of the most important parts of the policy because disclosure gives the platform and buyer a clearer understanding of the track’s origin. Without disclosure, a buyer may purchase a track without knowing that AI vocals were involved. A label may ask questions later. A distributor may require information. A buyer may have a personal or brand reason to avoid AI vocals. A support issue may become harder to resolve.

Disclosure protects everyone.

It protects buyers by giving them information before release.

It protects producers by making the submission honest.

It protects the platform by allowing moderation to review the track under the correct rules.

A producer who uses allowed AI vocals but hides the AI usage is still violating the spirit of the policy. The exception depends on disclosure.

What producers must disclose

Producers must disclose AI usage where required and provide the AI service name if AI was used.

For AI vocals, that means the producer should be clear about the vocal source and the tool or service used. The producer must not submit the track as if the vocal were original human vocals, royalty-free sample-pack vocals, or self-recorded vocals if that is not true.

The producer should also make sure the rest of the vocal source information is accurate. YGP’s vocal rules require producers to declare the vocal source type. Original vocals require vocalist or source details where required. For royalty-free or sample-pack vocals, if no vocalist source is provided, the sample pack name and URL must be supplied through provenance links.

That means AI disclosure is part of a wider provenance standard.

The platform needs to know where the vocal came from, whether the vocal is allowed, and whether the producer has the right to submit the track.

What producers must not submit

Producers must not submit:

fully AI-generated tracks

AI-generated instrumentals

AI-generated music parts

AI-generated stems

AI-generated drops

AI-generated melodies

AI-generated chord progressions

AI-generated backing tracks

AI-cloned vocals of real artists

AI vocals that impersonate real artists

Udio vocals

tracks with hidden AI usage

tracks using restricted AI vocal services

tracks with unclear vocal source information

tracks where rights or permissions are not in place

The safest producer rule is simple: if the track depends on AI-generated music content, do not submit it. If the track uses AI vocals, submit only if the vocal is compliant, disclosed, and allowed under the current policy.

What buyers should understand about AI vocals

Buyers should understand that an AI vocal is different from an original human vocal or a royalty-free sample-pack vocal.

A compliant AI vocal may be allowed on YGP, but buyers should still consider whether it fits their release strategy. Some artists may be comfortable using AI vocals. Others may avoid them completely. Some labels may have strict rules. Some distributors may require disclosure. Some brand campaigns may reject AI-related material.

That does not mean every AI vocal is unusable. It means buyers should make an informed decision.

Before buying a track with vocals, check:

whether the track contains vocals

what type of vocal is disclosed

whether AI was used

which AI service was used, if shown or required

whether the vocal sounds like a real artist

whether the track fits your label or distributor requirements

whether you are comfortable releasing music with AI vocals

whether you need support clarification before purchase

A buyer should not wait until after release to ask these questions.

What buyers should understand about non-AI vocals

Not every vocal concern is about AI.

A vocal may be original, royalty-free, sample-pack based, or AI-generated under allowed conditions. Each category has its own rights and disclosure needs.

For original vocals, producers must provide vocalist or source details where required. For royalty-free or sample-pack vocals, if no vocalist source is provided, producers must provide the sample pack name and URL through provenance links. Vocal impersonation and voice-cloning of real artists are not allowed. All rights and permissions must be in place before submission.

Buyers should not assume all vocals are unique unless the vocal source rules and track information support that. A royalty-free vocal may be legally usable under certain terms but may not be unique to one buyer. An original vocal may have different considerations. An AI vocal may require specific disclosure.

The right approach is to check the vocal type and terms before buying.

What this policy means for track originality

YGP’s AI policy supports originality by keeping fully AI-generated music out of the catalog.

A ghost production marketplace should not be filled with prompt-generated tracks that anyone could create through the same tool. Buyers are looking for music with real production decisions, genre understanding, arrangement work, sound design, mixing choices, and professional delivery.

Banning AI-generated tracks, AI-generated music parts, and AI-generated stems helps protect that expectation.

This does not mean AI is the only originality risk. Producers still need to avoid copied melodies, stolen drops, uncleared samples, unauthorized vocals, fake rights claims, and misleading metadata. But AI-generated music content is a specific risk category, and YGP’s current policy addresses it directly.

The result is a cleaner standard for the marketplace.

What this policy means for buyer rights

AI policy and buyer rights are connected, but they are not the same thing.

The track-specific rights badge and purchase terms define what rights the buyer receives. On YGP, the site may show rights badges such as “Royalty-free / commercial-use track” or “Non-exclusive beat.” The practical intent in the current setup is that buyers can release and use purchased tracks commercially under their own brand or artist identity, according to the purchase terms shown or linked at checkout.

AI policy defines what kind of AI material can be submitted in the first place.

A track could have commercial-use rights but still create problems if it contained undisclosed or disallowed AI material. That is why the platform needs both rights badges and AI rules.

Buyers should check both sides:

What rights does the track provide?

What does the track disclose about AI or vocals?

A clean purchase decision depends on both.

What this policy means for producer moderation

AI disclosure is part of the moderation process.

Producers on YGP apply, get approved, complete onboarding, sign the agreement, set payout details, create a track draft, upload required deliverables, fill in metadata, provenance, AI, and vocal disclosures, then submit for moderation. After submitting, editing and uploads are locked until a decision is made.

This matters because AI policy is not just a public statement. It is tied to the actual producer submission flow.

A producer cannot submit a track first and explain later if the AI usage is unclear. The required disclosures must be handled during submission. Once submitted, the track enters moderation, and editing and uploads are locked until a decision is made.

If a track uses AI in a way that violates policy, it should not be submitted.

Can AI be used for mastering, mixing, cleanup, or technical assistance?

YGP’s current policy confirms that fully AI-generated tracks, AI-generated music parts, and AI-generated stems are banned, and that compliant, disclosed AI vocals may be allowed under strict rules.

The policy does not yet confirm a detailed rule for every non-generative or assistive AI tool, such as AI-assisted mastering, AI mixing assistants, AI repair tools, AI transcription, AI noise reduction, or AI sample detection.

NEEDS OWNER CONFIRMATION: Whether AI-assisted mastering, AI mixing tools, AI repair tools, AI transcription, AI sample detection, or other non-generative AI production utilities are allowed, restricted, or require disclosure under the current YGP submission policy.

Until that is confirmed, the public policy should avoid making claims that are not verified. The safe statement is that AI-generated music content is not allowed, AI-generated stems are not allowed, and only compliant disclosed AI vocals may be allowed.

Can AI be used for artwork or listing images?

The AI policy covered here focuses on AI-generated tracks, music parts, stems, and vocals. It does not confirm a separate rule for AI-generated artwork or listing images.

NEEDS OWNER CONFIRMATION: Whether AI-generated artwork is allowed, restricted, discouraged, or separately moderated for YGP track listings.

Because this page is about AI usage in music submissions, it should not invent an artwork policy. If YGP has a separate rule for artwork, it should be added to the platform documentation and linked from this article.

What happens if a producer breaks the AI policy?

The current policy confirms the AI restrictions and disclosure requirements, but it does not define a complete enforcement ladder for every violation.

NEEDS OWNER CONFIRMATION: What exact enforcement actions apply when a producer submits disallowed AI-generated tracks, hidden AI usage, AI-generated stems, AI-cloned real-artist vocals, Udio vocals, or false AI disclosures.

Until that enforcement process is confirmed, this article should avoid inventing penalties. The safe wording is that producers must follow the AI policy and should not submit tracks that violate it. Tracks that do not meet the rules may fail moderation or require platform review.

A public policy should be firm about what is allowed and not allowed, without inventing account penalties that may not be documented yet.

What happens if a buyer spots possible AI misuse?

If a buyer sees or hears something suspicious, they should contact support.

Examples include:

a vocal that sounds like a famous artist

unclear AI disclosure

a track that appears to be generated by AI

a vocal source that does not match the listing

a track that seems to use disallowed AI material

metadata that conflicts with the track information

missing or vague provenance information

Under YGP’s rules, producers are responsible for accurate metadata and rights disclosures, and YGP can moderate, but mistakes can happen. Users should contact support if they spot an issue.

This is the correct standard. The platform has rules and moderation, but users should report concerns so they can be checked.

Does YGP guarantee that every track has zero AI issues?

No platform should promise that every track detail is guaranteed perfect.

YGP requires disclosures and has rules, but producers are responsible for accurate metadata and rights information. YGP can moderate, but mistakes can happen. If a user spots an issue, they should contact support.

That is a more honest and professional standard than pretending errors are impossible.

The point of the AI policy is to reduce risk, create clear rules, require disclosure, and give buyers a better basis for trust. It is not a claim that no producer could ever submit incorrect information.

Buyers should still read track details carefully, especially for vocal tracks or anything that may involve AI.

Why the AI vocal exception needs strict limits

The AI vocal exception exists because vocals can be handled differently from fully generated tracks, but the exception needs strict limits to stay safe.

A human-produced instrumental with a compliant disclosed AI vocal is different from a fully AI-generated track. But the vocal can still create risk if it imitates a real artist, uses a disallowed service, lacks disclosure, or conflicts with release requirements.

That is why the policy does not simply say “AI vocals allowed.”

It says AI vocals are allowed only if compliant. AI usage must be disclosed. The AI service name is required if AI was used. AI-cloned vocals of real artists are not allowed. Restricted AI vocal services must not be used. Udio vocals are disallowed.

This keeps the exception narrow and reviewable.

How this policy protects buyers

The policy protects buyers by making AI-related restrictions clear before purchase.

A buyer should not have to guess whether a track is prompt-generated. A buyer should not have to worry that stems were generated by AI and quietly included in the package. A buyer should not unknowingly buy a track with a cloned celebrity voice. A buyer should not discover hidden AI usage after pitching to a label.

No policy can remove every possible risk, but this one creates a stronger marketplace standard.

Buyers can make better decisions when the platform bans high-risk AI-generated music content and requires disclosure for the narrow AI vocal exception.

How this policy protects producers

The policy also protects serious producers.

Human production takes skill, time, taste, and technical ability. If a marketplace allowed fully AI-generated tracks next to producer-made music, it would lower the value of real production work and create unfair competition.

By banning fully AI-generated tracks, AI-generated music parts, and AI-generated stems, YGP protects producers who are actually making music.

The policy also tells producers what is expected before they submit. That saves time and reduces confusion. A producer who knows the rules can prepare compliant tracks and avoid wasting moderation time on disallowed material.

How this policy protects the marketplace

A ghost production marketplace depends on trust.

Buyers need to trust the track information. Producers need to trust that the platform has standards. The platform needs to avoid rights confusion, fake originality claims, and low-quality AI-generated catalog flooding.

AI-generated music can scale quickly. Without rules, a marketplace could be filled with tracks that are not properly produced, not properly disclosed, or not safe for buyers to release. That would damage the catalog and the brand.

YGP’s policy keeps the marketplace focused on ready-to-release tracks submitted by approved producers, with clear restrictions around AI-generated content.

Buyer checklist before purchasing a track with vocals

Before buying a vocal track, check:

Is the track vocal or instrumental?

What vocal type is shown?

Is AI usage disclosed?

If AI was used, is the AI service identified where required?

Does the vocal sound like a real artist?

Does the track fit your distributor or label requirements?

Do the rights badge and purchase terms fit your intended use?

Do you need clarification from support before buying?

If you do not want AI vocals, avoid tracks where AI vocals are disclosed or ask for clarification before purchase.

Producer checklist before submitting a track

Before submitting a track to YGP, producers should ask:

Was the full track generated by AI?

Were any music parts generated by AI?

Were any stems generated by AI?

Were any vocals generated by AI?

If AI vocals were used, are they compliant?

Was the AI service name provided?

Were any restricted AI vocal services used?

Were Udio vocals used?

Does the vocal imitate or clone a real artist?

Are all vocal sources and rights clear?

Are all AI and provenance disclosures accurate?

If the answer creates a policy problem, do not submit the track.

The simple answer

YGP’s AI Usage Policy is strict: fully AI-generated tracks, AI-generated music parts, and AI-generated stems are not allowed.

The only AI-related exception is AI vocals, and only if they are compliant, properly disclosed, and do not clone or impersonate real artists. If AI is used, the AI service name is required. Udio vocals are disallowed under the current policy.

This policy exists to protect buyers, serious producers, and the marketplace itself.

Your Ghost Production is built for ready-to-release tracks from approved producers. AI-generated music content does not fit that standard. Compliant AI vocals may be allowed, but only under clear disclosure and strict limits.

FAQ
Does Your Ghost Production allow fully AI-generated tracks?

No. Fully AI-generated tracks are not allowed under YGP’s current policy.

Are AI-generated music parts allowed?

No. AI-generated music parts are not allowed.

Are AI-generated stems allowed?

No. AI-generated stems are not allowed.

Are AI vocals allowed?

AI vocals may be allowed only if they are compliant, properly disclosed, and follow YGP’s rules.

Does YGP allow AI-cloned vocals of real artists?

No. AI-cloned vocals of real artists are not allowed.

Are Udio vocals allowed?

No. Udio vocals are disallowed under the current policy.

Does a producer have to disclose AI usage?

Yes. AI usage disclosure is required. If AI is used, the AI service name is required.

Does YGP ban all AI?

No. The accurate wording is that fully AI-generated tracks, AI-generated music parts, and AI-generated stems are banned. Compliant disclosed AI vocals may be allowed under strict conditions.

Can AI mastering or AI mixing tools be used?

NEEDS OWNER CONFIRMATION. The current policy confirms restrictions on AI-generated tracks, music parts, stems, and AI vocals, but it does not confirm the rule for AI-assisted mastering, AI mixing, repair, transcription, or other non-generative tools.

What should I do if I suspect a track uses disallowed AI?

Contact support with the track title, listing information, and a clear explanation of the concern. Producers are responsible for accurate disclosures, and users should report issues if they spot them.
