Try to avoid them in the first place by using derived tables or CTEs
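For example, a single-use intermediate result can often be expressed as a CTE instead of a temp table. A minimal sketch, where dbo.table1 and the filter are just placeholders:
;WITH cte AS
(
    SELECT col1, col2 FROM dbo.table1 WHERE col2 > 0
)
SELECT col1, COUNT(*) AS cnt FROM cte GROUP BY col1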
Table variables are easy and convenient, because you don't have to clean them up.
But only use them for small sets (fewer than 1,000 rows).
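A minimal sketch of a table variable; the column names follow the examples further down and are just placeholders:
DECLARE @smallset TABLE (col1 int, col2 int)
INSERT INTO @smallset (col1, col2) SELECT Col1, Col2 FROM dbo.table1 WHERE Col1 < 1000
-- no DROP needed, the table variable goes out of scope automatically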
Do not use SELECT col1 INTO #temptable FROM dbo.table1
But do use a CREATE TABLE statement followed by an INSERT/SELECT:
CREATE TABLE #temptable (col1 int, col2 int)
INSERT INTO #temptable (col1,col2) SELECT Col1, Col2 FROM dbo.table1
It is a bit more work, but it causes less blocking in environments with a higher load.
With larger sets, consider adding indexes and check the query plan and STATISTICS IO to see whether they provide any benefit (see the sketch after the index statements below).
Create these indexes after the table has been filled:
INSERT INTO #temptable (col1,col2) SELECT Col1,Col2 FROM dbo.table1
CREATE CLUSTERED INDEX CIX_#temptable ON #temptable (col1)
CREATE INDEX IX_#temptable_1 ON #temptable (Col2)
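To verify that the indexes actually help, something like this can be used (a sketch, assuming the #temptable from the examples above):
SET STATISTICS IO ON
SELECT col1, col2 FROM #temptable WHERE col1 = 42
-- compare logical reads and the actual query plan with and without the indexes
SET STATISTICS IO OFF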
Create only the indexes you absolutely need.
Configure TempDB as suggested in my post on configuring TempDB.
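As an illustration only (the actual recommendations are in that post), pre-sizing the TempDB files so they do not autogrow during peak load could look like this; tempdev and templog are the default logical file names and the sizes are made-up values:
ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, SIZE = 4096MB, FILEGROWTH = 512MB)
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, SIZE = 1024MB, FILEGROWTH = 256MB)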
Can your TempDB database disks handle the current load? See my post on fn_virtualfilestats
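A quick way to look at the I/O statistics and stalls for the TempDB files (a sketch; see the referenced post for interpreting the numbers):
SELECT * FROM sys.fn_virtualfilestats(DB_ID('tempdb'), NULL)
-- NULL as the second argument returns the statistics for all files of the database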
Prefer locally (#temptable) over globally (##temptable) scoped definitions.
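The difference in a nutshell:
CREATE TABLE #temptable (col1 int)   -- local: visible only to the current session, dropped when that session ends
CREATE TABLE ##temptable (col1 int)  -- global: visible to all sessions, so name collisions and blocking are easy to run into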
Store results from linked server queries in TempDB objects instead of accessing the remote data source multiple times. See my post on linked servers.
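A minimal sketch, assuming a linked server called REMOTESRV; the server, database, and table names are placeholders:
CREATE TABLE #remotedata (col1 int, col2 int)
INSERT INTO #remotedata (col1, col2)
SELECT col1, col2 FROM REMOTESRV.RemoteDb.dbo.table1
-- reuse #remotedata in the rest of the batch instead of querying the linked server again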
Be a good citizen: at the end of your stored procedure or T-SQL code, always clean up after yourself and drop all temporary objects you have created.
--drop temp tables if they exist
IF OBJECT_ID('tempdb..#temptable') IS NOT NULL
DROP TABLE #temptable