Group Details

Game Development

Welcome to the Game Development group! You are free to ask anything relating to games/gamedev in this group.

  • The Next 100x Crypto Meme Gem

    PEIPEI 100x GEM in 2025


    PeiPei Token Analysis: A Rising Meme Coin with Serious Potential

    PeiPei (PEIPEI), a meme coin launched just last month, is quickly becoming a notable contender in the crypto market. Its distinctive blend of meme culture and strategic endorsements has garnered significant attention and substantial growth.

    Major Endorsement from Bitcoin Expert Davinci Jeremie

    One of the most influential factors in PeiPei's early success is the endorsement from Davinci Jeremie, a well-known Bitcoin OG who advised investing in Bitcoin back in 2011. Jeremie's endorsement is not just superficial; he has proclaimed himself as the “official ambassador” of PeiPei and has invested heavily in the token. His Etherscan account reveals holdings of over $775,000 worth of PEIPEI, equivalent to 2.1 trillion tokens. Such a significant endorsement from a seasoned crypto expert underscores PeiPei's potential.

    Impressive Market Performance

    Since its launch on June 5, PeiPei has shown remarkable growth. The token’s price surged by 997%, pushing its market cap close to $150 million. Currently, PEIPEI trades at approximately $0.000000363 per token and ranks as the 10th most traded meme coin globally. Over the past month, PeiPei's price has surged 222.40%, securing the 284th position in the global cryptocurrency market list with a capitalization of $147.70 million. This bullish sentiment is supported by the token’s performance within an ascending channel pattern, suggesting potential for further growth.

    Unique Design and Ambitious Roadmap

    PeiPei sets itself apart by incorporating elements of Chinese culture with the iconic Pepe the Frog meme. This unique design has attracted a significant following, with over 22,000 Twitter followers. The development team’s roadmap includes ambitious plans for partnerships, marketing campaigns, and even the audacious goal of challenging the US dollar. The token's supply is capped at 420.60 trillion, with no buy/sell tax on trades, enhancing its appeal to investors.

    Technical Analysis and Market Sentiment

    From a technical perspective, PeiPei's price is bolstered by its SMA indicator, which acts as a support in the daily time frame. Although the Relative Strength Index (RSI) has not breached the overbought range, indicating mixed sentiment, the overall trend remains positive. The token has shown a consistent increase in trading volume, with a 7% price rise in just one day and a 67.79% surge over the past week. Its trading volume reached $208.7 million, and it recently recorded an all-time high (ATH) of $0.0000003943.

    Conclusion: A Promising Future
    PeiPei's rapid rise, combined with strategic endorsements and a robust market performance, suggests that it is more than just another meme coin. With Davinci Jeremie's backing and a unique cultural twist, PeiPei is well-positioned to continue its upward trajectory. However, like all meme coins, its future will depend on maintaining momentum and investor interest. Given its current trajectory, PeiPei is undoubtedly a token to watch in the coming months.

    Find more info

    posted in Crypto
  • RE: The third-party login system is not working on LankaDevelopers.lk.

    Ohh, thanks for reporting. We'll solve it ASAP.

    posted in Comments & Feedback
  • RE: TALL STACK - Application optimize

    What do you mean by optimize?

    posted in Laravel Framework
  • RE: Mora UXplore 1.0 - University of Moratuwa IEEE Student Branch

    Glad to hear that, thanks for sharing.

    posted in Events ( Free & Paid )
  • Mora UXplore 1.0 - University of Moratuwa IEEE Student Branch


    " Even the greatest was once a beginner. Don't be afraid to take that first step "
    - Muhammad Ali

    Mora UXplore 1.0 is a UI/UX designing competition for undergraduates from any university. It aims to offer a platform for them to explore, learn, and enhance their skills in this field. 🤩✨

    🔴 REGISTRATION IS NOW OPEN 🔴

    Click the link below and take the first step to be the best👇
    https://morauxplore.lk

    For more info, join our Facebook event:
    https://fb.me/e/2AyV3rg8Q

    Caption by- D.M.M. Shehan
    Design by- Dilina Raveen

    -Inspired by PASSION to Transform beyond EXCELLENCE-

    #MoraUXplore1.0
    #IEEESBUOM
    #TERM22/23

    posted in Events ( Free & Paid )
  • RE: Join the Festivities! Win a Free T-Shirt by Becoming a Part of Lanka Developers Community this New Year!

    Done!

    #NewYearGiveaway
    #WinWithLankaDevelopers
    #JoinOurCommunity
    #FreeTShirts
    #ContributeToWin

    posted in Announcements
  • RE: Lanka Developers Community T-shirts

    @kupeshanth

    You can place the order here.

    posted in Announcements
  • LocalAI: A drop-in replacement for OpenAI

    LocalAI

    LLaMA, alpaca, gpt4all, vicuna, koala, gpt4all-j


    Self-hosted, community-driven, simple local OpenAI-compatible API written in Go. It can be used as a drop-in replacement for OpenAI, running on CPU with consumer-grade hardware. Supports ggml-compatible models: LLaMA, alpaca, gpt4all, vicuna, koala, gpt4all-j.


    Using LocalAI is straightforward. You can install LocalAI on your local machine or server via Docker and start running inference tasks immediately. Let's get started:

    1. Install Docker on your PC or server (installation depends on the OS type; check here)

    2. Open a terminal or cmd and clone the LocalAI repo from GitHub

        git clone https://github.com/go-skynet/LocalAI
      
    3. Go to the LocalAI/models folder in the terminal

        cd LocalAI/models
      
    4. Download the model (here I use the gpt4all-j model; it is Apache 2.0 licensed, so it can be used for commercial purposes)

       wget https://gpt4all.io/models/ggml-gpt4all-j.bin
      

      Here I use wget for the download; you can also download the bin file manually and copy it into the LocalAI/models folder.

    5. Go back to the LocalAI root folder

        cd ..

    6. Start it with Docker Compose

         docker compose up -d --build
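    The six steps above can be gathered into one shell sketch. This is a convenience wrapper of my own, not part of the official docs: it assumes git, wget, and Docker (with the compose plugin) are installed, and that you have network access for the clone and the large model download.

```shell
# Sketch of steps 1-6 above as a single function (assumes git, wget and
# Docker with the compose plugin are installed, plus network access)
setup_localai() {
  set -e                                             # stop on the first failure
  git clone https://github.com/go-skynet/LocalAI     # step 2: clone the repo
  cd LocalAI/models                                  # step 3: enter the models folder
  wget https://gpt4all.io/models/ggml-gpt4all-j.bin  # step 4: large download
  cd ..                                              # step 5: back to the repo root
  docker compose up -d --build                       # step 6: build and start
}
# Run it with: setup_localai
```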
      

    After the above process finishes, let's call LocalAI from the terminal or cmd. Here I use curl, but you can use any other tool that can perform HTTP requests (Postman, etc.).

    curl http://localhost:8080/v1/models
    

    This request lists the models we have added to the models directory.

    Let's call the AI with an actual prompt.

    curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d '{
         "model": "ggml-gpt4all-j",            
         "prompt": "Explain AI to me like A five-year-old",
         "temperature": 0.7
       }'
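    When scripting several prompts, it can help to build the JSON body in a variable first. A small sketch of my own, reusing the same endpoint and model name as above (the commented curl line needs the container from step 6 running):

```shell
# Assemble the JSON body for the completions call shown above
MODEL="ggml-gpt4all-j"
PROMPT="Explain AI to me like a five-year-old"
BODY=$(printf '{"model": "%s", "prompt": "%s", "temperature": 0.7}' "$MODEL" "$PROMPT")
echo "$BODY"
# Send it (needs the container from step 6 running):
# curl http://localhost:8080/v1/completions -H "Content-Type: application/json" -d "$BODY"
```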
    

    Windows compatibility

    It should work; however, you need to make sure you give the container enough resources.

    Kubernetes
    You can run the API in Kubernetes; see the example deployment in kubernetes.


    API Support
    LocalAI provides an API for running text generation as a service that follows the OpenAI reference and can be used as a drop-in replacement. Once loaded for the first time, models are kept in memory.

    Example of starting the API with Docker:

    docker run -p 8080:8080 -ti --rm quay.io/go-skynet/local-api:latest --models-path /path/to/models --context-size 700 --threads 4
    

    Then you'll see:

    ┌───────────────────────────────────────────────────┐ 
    │                   Fiber v2.42.0                   │ 
    │               http://127.0.0.1:8080               │ 
    │       (bound on host 0.0.0.0 and port 8080)       │ 
    │                                                   │ 
    │ Handlers ............. 1  Processes ........... 1 │ 
    │ Prefork ....... Disabled  PID ................. 1 │ 
    └───────────────────────────────────────────────────┘ 
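    Since the container build and the first model load can take a while, a small helper that polls the API until it answers is handy before sending prompts. A sketch of my own, assuming curl is installed and the server listens on port 8080:

```shell
# Poll the models endpoint until the server responds (assumes curl)
wait_for_localai() {
  until curl -sf http://localhost:8080/v1/models >/dev/null; do
    sleep 1   # retry once per second until the API is up
  done
}
# Usage: wait_for_localai && echo "LocalAI is ready"
```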
    

    If you want more info about the API, go to the GitHub page:
    https://github.com/go-skynet/LocalAI#api

    posted in Artificial Intelligence
  • RE: Do anyone works with cardcom payment gateway?

    @rohandhananjaya

    You're welcome.

    Do some reverse engineering on laravel-cardcom. I think you can convert it to a plain PHP package by removing the Laravel facade. If you want, I'll help you.

    posted in Web Development